
fix: add support for kimi-k2.5 and kimi-for-coding models#167

Open
tianling536 wants to merge 1 commit into dataelement:main from tianling536:fix-kimi-k2.5-temperature

Conversation


@tianling536 tianling536 commented Mar 22, 2026

Description

This PR adds support for Kimi K2.5 and Kimi for Coding models.

1. kimi-k2.5 temperature fix

kimi-k2.5 models only support temperature=1. When any other value is passed, the API returns:

"invalid temperature: only 1 is allowed for this model"

Fix: Automatically set temperature=1.0 when the model name contains 'kimi-k2.5' or 'kimi-k2-5'.
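A minimal sketch of what this guard in llm_client.py might look like (the helper name `normalize_temperature` is hypothetical, not the actual function in this PR):

```python
def normalize_temperature(model: str, temperature: float) -> float:
    """Force temperature=1.0 for kimi-k2.5 models, which reject other values.

    Hypothetical helper; the model-name check mirrors the PR description.
    """
    if "kimi-k2.5" in model or "kimi-k2-5" in model:
        return 1.0
    return temperature
```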

2. kimi-for-coding support

Kimi for Coding uses Anthropic Messages API format instead of OpenAI format.

Added: New 'kimi-coding' provider with:

  • Protocol: anthropic (Anthropic Messages API)
  • Base URL: https://api.kimi.com/coding
  • Endpoint: /v1/messages
  • Default model: k2p5
  • Max tokens: 32768
  • No anthropic-version header required
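The bullets above could be captured in a provider-registry entry along these lines (a sketch only; the key names are assumptions, not the actual schema used by llm_client.py):

```python
# Hypothetical registry entry mirroring the provider settings listed above.
KIMI_CODING_PROVIDER = {
    "name": "kimi-coding",
    "protocol": "anthropic",                 # Anthropic Messages API
    "base_url": "https://api.kimi.com/coding",
    "endpoint": "/v1/messages",
    "default_model": "k2p5",
    "max_tokens": 32768,
    "anthropic_version_header": False,       # no anthropic-version header required
}
```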

3. Endpoint path fix

Custom kimi endpoints (api.kimi.com/coding/) need /v1/chat/completions instead of /chat/completions.

Fix: Added _get_chat_endpoint() method to handle special cases.

Changes

  • Modified: backend/app/services/llm_client.py

Configuration

For kimi-k2.5

  • Provider: Kimi (Moonshot)
  • Model: kimi-k2.5

For kimi-for-coding

  • Provider: kimi-coding
  • Model: k2p5
tianling536 force-pushed the fix-kimi-k2.5-temperature branch from 76ceba9 to 2538782 on March 22, 2026 at 15:43
tianling536 changed the title from "fix: force temperature=1 for kimi-k2.5 models" to "fix: add support for kimi-k2.5 and kimi-for-coding models" on Mar 22, 2026