Bug Report
Description
When a tool call has `state: "output-error"` with `input: undefined` and `rawInput` set to a string (because the model generated invalid JSON for the tool input), `convertToModelMessages` sends the raw string as the `tool_use.input` value to Anthropic.
Anthropic's API requires `tool_use.input` to be a JSON object/dictionary, so it rejects the request with:

```
400: messages.N.content.M.tool_use.input: Input should be a valid dictionary
```

This creates an unrecoverable loop in multi-turn conversations: the corrupted tool call is part of the message history, so every subsequent inference attempt replays the same invalid message and receives the same 400 error.
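For illustration, the replayed assistant message ends up containing a content block roughly like the following (tool name, id, and values are hypothetical); note that `input` is a string, while Anthropic requires an object:

```json
{
  "role": "assistant",
  "content": [
    {
      "type": "tool_use",
      "id": "toolu_abc123",
      "name": "send_email",
      "input": "{\"to\": John Doe <john@example.com>}"
    }
  ]
}
```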
Root Cause
In `convertToModelMessages`, when processing tool UI parts with `state === "output-error"`:

```js
input: part.state === "output-error"
  ? (_a = part.input) != null ? _a : "rawInput" in part ? part.rawInput : void 0
  : part.input,
```

When `part.input` is `undefined` (as set during streaming when JSON parsing fails), it falls back to `part.rawInput`, which is a string, not an object. Anthropic (and likely other providers) requires `input` to be a dict/object.
Reproduction Steps
- Use `streamText` with Anthropic and tools that have structured input schemas
- The model generates a tool call with syntactically invalid JSON in the input (e.g., an unquoted string value like `"to": John Doe <john@example.com>`)
- The AI SDK correctly catches the parse error and creates a part with `state: "output-error"`, `input: undefined`, `rawInput: "<the invalid JSON string>"`
- On the next turn (or retry), `convertToModelMessages` converts this part back to a model message with `input` set to the raw string
- Anthropic returns `400: messages.N.content.M.tool_use.input: Input should be a valid dictionary`
- The conversation is now permanently stuck: every retry replays the same broken history
Expected Behavior
`convertToModelMessages` should ensure `tool_use.input` is always a valid object when converting output-error parts. Options:

- Attempt to parse `rawInput` as JSON; if it fails, use `{}` or `{ "raw": rawInput }`
- Always use `{}` as a fallback when `input` is undefined, regardless of `rawInput`
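A minimal sketch of the first option (`coerceToolInput` is a hypothetical helper name, not an SDK API): parse `rawInput` if possible, wrap it otherwise, and fall back to `{}` for non-string values.

```typescript
// Hypothetical helper: coerce rawInput into an object that is always
// valid as tool_use.input.
function coerceToolInput(rawInput: unknown): Record<string, unknown> {
  if (typeof rawInput === "string") {
    try {
      const parsed = JSON.parse(rawInput);
      // Only accept plain objects; arrays and primitives are not
      // dictionaries either, so they get wrapped below.
      if (parsed !== null && typeof parsed === "object" && !Array.isArray(parsed)) {
        return parsed as Record<string, unknown>;
      }
    } catch {
      // Invalid JSON: fall through and preserve the raw string.
    }
    return { raw: rawInput };
  }
  return {};
}

console.log(coerceToolInput('{"to": "john@example.com"}')); // parsed object
console.log(coerceToolInput('{"to": John Doe}')); // wrapped: { raw: "..." }
console.log(coerceToolInput(undefined)); // {}
```

Wrapping (rather than dropping) the raw string keeps the model's invalid attempt visible in history, which may help it self-correct on retry.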
Related
- Cleanup error message state in UI messages #11154 — tracks the broader cleanup of error message state (v6/v7 TODO)
- LLM passing escaped double quotes in tool arguments result in error #11719 — LLM generating invalid JSON in tool arguments
Environment
- AI SDK version: 5.x (latest)
- Provider: `@ai-sdk/anthropic`
- Runtime: Node.js (AWS Lambda)