fix(provider): auto-enable interleaved for reasoning models#24218
fkyah3 wants to merge 1 commit into anomalyco:dev from
Conversation
…k/openai-compatible
When a user configures a model with reasoning: true but doesn't explicitly
set interleaved, the default was false. This caused reasoning_content to be
dropped during message replay for @ai-sdk/openai-compatible users.
Now, if reasoning: true is configured and interleaved is not explicitly set,
it defaults to { field: "reasoning_content" } instead of false.
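The described default can be sketched as follows; the type and field names here are inferred from this description and may not match the actual provider.ts shapes:

```typescript
// Hypothetical config shapes; field names follow the PR description,
// the real provider.ts types may differ.
type InterleavedConfig = false | { field: string };

interface ModelConfig {
  reasoning?: boolean;
  interleaved?: InterleavedConfig;
}

// An explicitly configured interleaved value always wins; otherwise a
// reasoning model defaults to extracting "reasoning_content".
function resolveInterleaved(model: ModelConfig): InterleavedConfig {
  if (model.interleaved !== undefined) return model.interleaved;
  return model.reasoning ? { field: "reasoning_content" } : false;
}
```

With this resolution, `reasoning: true` alone yields `{ field: "reasoning_content" }`, while an explicit `interleaved: false` is still respected.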
Thanks for your contribution! This PR doesn't have a linked issue. All PRs must reference an existing issue. Please see CONTRIBUTING.md for details.
The following comment was made by an LLM; it may be inaccurate: Based on the search results, here are the potentially related PRs:
These PRs are related to reasoning models, interleaved content handling, and DeepSeek thinking mode, the same areas addressed by PR #24218. However, they appear to be focused on different aspects of the problem, or may have been earlier attempts to handle similar issues.
Thanks for updating your PR! It now meets our contributing guidelines. 👍
@fkyah3 I tried your PR, and I'm using DeepSeek v4 Pro via OpenRouter, and I still have the issue.
@Avanatiker Can you share your opencode.json provider/model config for OpenRouter? Specifically, which provider type are you using? PR #24218 only fixes the @ai-sdk/openai-compatible path. If you're using the built-in OpenRouter provider, your requests go through a different SDK. Even with this fix, there's a second layer to the bug: the interleaved transform hardcodes the openaiCompatible providerOptions key, which OpenRouter doesn't read. Our fork has the full fix.
Let me know your config and I can help narrow down which layer is failing for you.
hey! thanks for the quick reply. I'm using the built-in OpenRouter provider and just set up my API key, no specific config. I'll try your fork next :)
@Avanatiker Got it: the built-in OpenRouter provider means your requests don't go through @ai-sdk/openai-compatible, so PR #24218 alone won't cover you. If you try the fork and it works, that confirms the issue is in the providerOptions key handling. Let me know how the fork test goes!
@Avanatiker I found the issue. Our fork (fkyah3) has a bug for OpenRouter specifically: the interleaved transform writes the wrong providerOptions key. Workaround that should work: apply PR #24218 alone (NOT the fork). The upstream normalizeMessages hardcodes openaiCompatible as the providerOptions key, so if #24218 doesn't work for you either, there's something deeper going on with OpenRouter's message format. Can you share the exact error text from your screenshot? I'll fix the key issue in our fork too.
I'm confused. I tried this PR first?
@Avanatiker Sorry for the confusion! You're right to test PR #24218 first, and the fact that it also fails tells us something important. Both PR #24218 and our fork failing for OpenRouter strongly suggests the issue is deeper, possibly in how OpenRouter replays assistant messages. Can you help me isolate the issue?
He claims it's fixed, but I tried the last commit: #24203 (comment)
@Avanatiker I checked issue #24203. rekram1-node closed it because a partial fix (PR #24146) was merged, but that fix only covers a specific code path, which is not enough for OpenRouter users. The issue for OpenRouter users is actually two separate bugs:
1. interleaved defaults to false for reasoning models, so reasoning_content is dropped during message replay.
2. The interleaved transform hardcodes the openaiCompatible providerOptions key, which the OpenRouter provider does not read.
You tested earlier, before our second fix was in. If you want to try again, pull the latest commit from our fork.
Can't guarantee it'll work (we don't have OpenRouter to test ourselves), but it should be better than before. I'll post an update here once we confirm.
Are you sure you pushed your changes? The latest commit is 48d1bdf2c90317cd46b3c48aede8d058e6f91096.
…ti-turn conversations
Fixes the two-layer bug where reasoning_content is dropped on conversation
replay for DeepSeek thinking mode and OpenRouter-routed DeepSeek models.
Three changes:
1. provider.ts: Auto-enable interleaved for reasoning models
- When model.reasoning is true but interleaved is not explicitly set,
default to { field: "reasoning_content" } instead of false
- This triggers the interleaved transform that extracts reasoning
and passes it via providerOptions
2. transform.ts: Use dynamic SDK key in interleaved transform
- Replace hardcoded "openaiCompatible" with sdkKey(model.api.npm)
- Fixes OpenRouter provider which expects "openrouter" key, not
"openaiCompatible" (prevents key mismatch in providerOptions)
3. transform.ts: Inject reasoning_content for ALL assistant messages
- New fallback transform fires when capabilities.reasoning is true
- Sets reasoning_content: "" in providerOptions for every assistant
message, including historical messages stored before reasoning mode
was enabled (no reasoning part to extract from)
- Also expands DeepSeek detection to check model.id in addition to
model.api.id, covering OpenRouter-routed DeepSeek models
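Changes 2 and 3 can be sketched together roughly like this. The Message/Model shapes and the sdkKey mapping are simplified stand-ins inferred from the description above, not the real opencode types:

```typescript
// Simplified stand-ins for the real opencode types.
interface Message {
  role: string;
  reasoning?: string; // extracted reasoning part, if any
  providerOptions?: Record<string, Record<string, unknown>>;
}

interface Model {
  id: string;
  api: { id: string; npm: string };
  capabilities: { reasoning: boolean };
}

// Change 2: derive the providerOptions key from the SDK package instead of
// hardcoding "openaiCompatible". This mapping is a hypothetical stand-in
// for the real sdkKey helper.
function sdkKey(npm: string): string {
  return npm.includes("openrouter") ? "openrouter" : "openaiCompatible";
}

// Change 3: for reasoning models, ensure every assistant message carries
// reasoning_content, even historical messages with no reasoning part.
function injectReasoningContent(messages: Message[], model: Model): Message[] {
  if (!model.capabilities.reasoning) return messages;
  const key = sdkKey(model.api.npm);
  return messages.map((msg) => {
    if (msg.role !== "assistant") return msg;
    const existing = msg.providerOptions?.[key] ?? {};
    return {
      ...msg,
      providerOptions: {
        ...msg.providerOptions,
        [key]: { reasoning_content: msg.reasoning ?? "", ...existing },
      },
    };
  });
}

// Change 3 also widens DeepSeek detection to cover OpenRouter-routed ids.
function isDeepSeek(model: Model): boolean {
  return model.api.id.includes("deepseek") || model.id.includes("deepseek");
}
```

Note that the existing providerOptions entry spreads over the injected default, so a message that already carries a reasoning_content value is left untouched.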
Closes anomalyco#24104
Related: anomalyco#24203 (OpenRouter users still affected by PR anomalyco#24218 alone)
Supersedes partial fix from PR anomalyco#24146 (merged but incomplete)
@fkyah3 Seems your fork is private now?
Repository moved. The fork has been renamed from opencode-fkyah3 to opencode-yg because the AI kept misinterpreting the author name in file paths. |



Issue for this PR
Closes #24104 (related)
Type of change
What does this PR do?
When using @ai-sdk/openai-compatible with a model configured with reasoning: true, the interleaved capability defaulted to false. This caused reasoning_content in providerOptions to be silently dropped during message replay.
The fix is a one-line change in provider.ts: if model.interleaved is not explicitly set and no existing model data provides it, check model.reasoning and default to { field: "reasoning_content" } instead of false.
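For illustration only, a config along these lines now gets interleaved handling without spelling it out (the provider and model names are hypothetical, and the opencode.json layout shown is an assumption):

```json
{
  "provider": {
    "my-provider": {
      "npm": "@ai-sdk/openai-compatible",
      "models": {
        "my-reasoning-model": {
          "reasoning": true
        }
      }
    }
  }
}
```

Previously, a config like this silently dropped reasoning_content on replay unless interleaved was also set explicitly.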
How did you verify your code works?
Screenshots / recordings
N/A
Checklist