Bug
In packages/opencode/src/provider/error.ts, the OR-chain that extracts a human-readable error message from a JSON response body short-circuits on OpenAI's standard 4xx response shape, causing the raw response body to be dumped to the user instead of the inner error.message.
Reproduction
Trigger any OpenAI 4xx error — for example by selecting a deprecated/restricted model — so the API returns:
{
  "error": {
    "message": "The model `gpt-5-codex` does not exist or you do not have access to it.",
    "type": "invalid_request_error",
    "code": "model_not_found"
  }
}
The user sees:
APIError: Bad Request: {"error":{"message":"The model `gpt-5-codex` does not exist or you do not have access to it.","type":"invalid_request_error","code":"model_not_found"}}
…and on systems that pass the message through a redaction layer (telemetry, log scrubbers), even worse:
APIError: Bad Request: {?:?}
In production telemetry from a downstream fork, this caused users to retry the same broken model selection 3+ times in the same session because the surfaced error gave no actionable hint about what was wrong.
Root cause
packages/opencode/src/provider/error.ts:67:
const errMsg = body.message || body.error || body.error?.message
if (errMsg && typeof errMsg === "string") {
  return `${msg}: ${errMsg}`
}
For OpenAI's {error: {message, type, code}} shape:
- body.message → undefined (falsy)
- body.error → the object {message, type, code} → truthy → assigned to errMsg
- body.error?.message → never evaluated (short-circuit)
- typeof errMsg === "string" → false (it's an object) → block skipped
- Falls through to line 89: return `${msg}: ${e.responseBody}` → raw structured body dumped
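The short-circuit can be reproduced in isolation. This is a minimal standalone sketch, not the actual module's code; `body` and `errMsg` are local stand-ins:

```typescript
// Hypothetical repro: `body` mirrors OpenAI's standard 4xx error shape.
const body: any = {
  error: {
    message: "The model `gpt-5-codex` does not exist or you do not have access to it.",
    type: "invalid_request_error",
    code: "model_not_found",
  },
}

// Buggy OR-chain: body.error is a truthy *object*, so evaluation stops
// there and body.error?.message is never reached.
const errMsg = body.message || body.error || body.error?.message

console.log(typeof errMsg) // "object" — fails the `typeof errMsg === "string"` guard
```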
The sibling function parseStreamError in the same file already uses the correct defensive pattern at line 153 (typeof body?.error?.message === "string" ? ... : default); only parseAPICallError has this bug.
Fix
PR #25763 replaces the OR-chain with explicit-typeof ternaries so a truthy non-string at any level cannot block a valid string further down the chain. After the fix the user sees:
APIError: Bad Request: The model `gpt-5-codex` does not exist or you do not have access to it.
Tests added cover OpenAI nested shape, top-level message, plain-string error, missing-message fallback, and non-string short-circuit guard.
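A sketch of the corrected pattern, following the explicit-typeof ternary approach described above (`extractMessage` is an illustrative helper, not the PR's actual code):

```typescript
// Each level is checked with an explicit typeof, so a truthy non-string
// at one level can never mask a valid string deeper in the chain.
function extractMessage(body: any, msg: string, responseBody: string): string {
  const errMsg =
    typeof body?.message === "string"
      ? body.message
      : typeof body?.error === "string"
        ? body.error
        : typeof body?.error?.message === "string"
          ? body.error.message
          : undefined
  // Fall back to the raw response body only when no string was found.
  return errMsg ? `${msg}: ${errMsg}` : `${msg}: ${responseBody}`
}
```

With this shape, the OpenAI nested payload resolves to the inner `error.message`, a top-level `message` or plain-string `error` still works, and a missing message falls back to the raw body.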