anthropic: input_json_delta tokens should not be passed to handleLLMNewToken #6936
Labels: auto:bug
Example Code
Introduced here: #6179 (comment), the Anthropic chat model with tools bound would pass input_json_delta tokens to handleLLMNewToken, resulting in the unwanted tokens being propagated to the callbacks.

Output
The last three tokens are wrong because they are part of Anthropic's input_json_delta. For comparison, OpenAI behaves differently:

Output
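To make the reported behavior concrete, here is a minimal, self-contained simulation (assumed event shapes, not the actual langchainjs internals): Anthropic streams tool-call arguments as input_json_delta fragments alongside normal text_delta tokens, and a handler that forwards every fragment reproduces the leak described above.

```typescript
// Assumed, simplified delta shapes mirroring Anthropic's streaming events.
type Delta =
  | { type: "text_delta"; text: string }
  | { type: "input_json_delta"; partial_json: string };

const deltas: Delta[] = [
  { type: "text_delta", text: "Let me check the weather." },
  { type: "input_json_delta", partial_json: '{"loca' },
  { type: "input_json_delta", partial_json: 'tion": "Par' },
  { type: "input_json_delta", partial_json: 'is"}' },
];

// Stand-in for the callback: collects every token it is given.
const received: string[] = [];
function handleLLMNewToken(token: string): void {
  received.push(token);
}

// Naive forwarding, as reported in the issue: every fragment is passed on,
// so raw JSON argument fragments end up in the callbacks.
for (const d of deltas) {
  const token = d.type === "text_delta" ? d.text : d.partial_json;
  handleLLMNewToken(token);
}

console.log(received);
// The last three entries are JSON fragments, not user-facing text.
```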
Interestingly enough, the behavior is not reproduced when using .streamEvents on the chat model, because argument tokens are returned as part of input_json_delta objects, but it is reproduced when using .streamEvents from the LangGraph agent, because content is returned as a plain string in that case.

Output (truncated)

Output (truncated)
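A small sketch of why the two paths differ (hypothetical, simplified content shapes, not the real langchainjs types): the chat model keeps the block type on each chunk, so a consumer can still filter argument fragments, while the LangGraph agent flattens content to a string, leaving nothing to filter on.

```typescript
// On the chat model, streamed content keeps its block type:
const chatModelContent: Array<{ type: string; partial_json?: string; text?: string }> = [
  { type: "input_json_delta", partial_json: '{"location":' },
];

// Through the LangGraph agent, content arrives flattened to a plain string,
// so argument fragments are indistinguishable from real text:
const langGraphContent: string = '{"location":';

// Filtering works on the structured form...
const visibleFromChatModel = chatModelContent
  .filter((block) => block.type !== "input_json_delta")
  .map((block) => block.text ?? "")
  .join("");

// ...but there is nothing to filter on in the flattened form.
const visibleFromLangGraph = langGraphContent;

console.log(visibleFromChatModel); // empty: argument fragment filtered out
console.log(visibleFromLangGraph); // JSON fragment leaks through
```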
I am not sure about the streaming behavior, but at least the callback behavior could be fixed by excluding input tokens here:

langchainjs/libs/langchain-anthropic/src/chat_models.ts, lines 167 to 174 in 660af3e
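A hedged sketch of the suggested fix (this is not the actual chat_models.ts code; the field names are assumptions): before calling handleLLMNewToken, guard on the delta type so that input_json_delta fragments are skipped and only user-visible text is emitted.

```typescript
// Assumed, simplified chunk shape for illustration.
type StreamChunk = {
  deltaType?: string; // e.g. "text_delta" | "input_json_delta" (assumed field)
  text: string;
};

// The proposed guard: suppress tool-argument fragments.
function shouldEmitToken(chunk: StreamChunk): boolean {
  return chunk.deltaType !== "input_json_delta";
}

const chunks: StreamChunk[] = [
  { deltaType: "text_delta", text: "Hello" },
  { deltaType: "input_json_delta", text: '{"a":1}' },
];

// Stands in for the handleLLMNewToken call site.
const emitted: string[] = [];
for (const c of chunks) {
  if (shouldEmitToken(c)) emitted.push(c.text);
}

console.log(emitted); // only the text_delta token survives
```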
Error Message and Stack Trace (if applicable)
No response
Description
"Arguments" tokens should not appear in the callbacks or when streaming with LangGraph.
System Info
npm info langchain