Using streamText with a custom API instead of a known model provider #4070
Hi everyone, I’m working on a project where I’d like to use the streamText function from the AI SDK, but not with any of the standard providers like OpenAI or Anthropic. Instead, I want to connect it to a custom API that behaves similarly to OpenAI’s streaming endpoint. Here’s a brief overview of my setup:
My goal is to use `streamText` inside my `chatWithAgent` function to handle the streaming response from agent-handler (Web Service B). However, instead of passing a standard provider (e.g., OpenAI), I need to simulate the behavior of a model using my custom API (agent-handler). The current `streamText` API seems to require a model object, but I don't have a traditional model; I want to stream data directly from my agent-handler service.

Is there a way to integrate `streamText` with a custom API like mine? If not, are there alternative approaches within the AI SDK? Here's what my `chatWithAgent` function ideally looks like:

```ts
export const chatWithAgent = async (): Promise<StreamTextResult<Record<string, CoreTool>>> => {
  ...
  const { textStream } = streamText({
    model: {
      // Instead of using a standard model, I want to use my custom agent-handler service here
      async doStream() {
        return await fetch(`URL_AGENT_HANDLER/chat/completions`, {
          method: 'POST',
          headers: {
            'Content-Type': 'application/json',
          },
          body: JSON.stringify({
            messages: chatMessages,
            model: 'gpt-3.5-turbo',
            stream: true,
            temperature: 0.7,
            max_tokens: 1000,
          }),
        });
      },
    },
    messages: chatMessages,
    onCompletion: async (completion: string) => {
      // Update the conversation once the response is fully received
    },
  });
  ...
};
```

Is it possible to use `streamText` without a standard provider and instead connect it to a custom API that streams data? If yes, how can I configure the model object to make this work? Any guidance, examples, or alternative approaches would be greatly appreciated. Thank you in advance!
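For context on what "behaves similarly to OpenAI's streaming endpoint" means in practice: an OpenAI-compatible `/chat/completions` stream emits Server-Sent Events whose `data:` payloads carry incremental `choices[0].delta.content` chunks, terminated by `data: [DONE]`. A minimal, self-contained sketch of pulling the text deltas out of such a body (the chunk shape is the standard OpenAI streaming format; the helper name `extractDeltas` is my own):

```typescript
// Sketch: extract text deltas from an OpenAI-style SSE body.
// Each event line looks like:
//   data: {"choices":[{"delta":{"content":"Hi"}}]}
// and the stream ends with:
//   data: [DONE]
export function extractDeltas(sseBody: string): string[] {
  const deltas: string[] = [];
  for (const line of sseBody.split('\n')) {
    const trimmed = line.trim();
    if (!trimmed.startsWith('data:')) continue;
    const payload = trimmed.slice('data:'.length).trim();
    if (payload === '[DONE]') break;
    const chunk = JSON.parse(payload);
    const content = chunk.choices?.[0]?.delta?.content;
    if (typeof content === 'string') deltas.push(content);
  }
  return deltas;
}
```

If the custom agent-handler emits exactly this wire format, it is "OpenAI compatible" for streaming purposes, which is what makes the provider-with-custom-baseURL approach in the reply below work without any custom model object.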
Replies: 1 comment 1 reply
If your endpoint is fully OpenAI compatible, you can use the openai provider and set a custom baseURL. See https://sdk.vercel.ai/providers/openai-compatible-providers