Comments (6)
So I got it working with a custom AIStream; I basically just copied their OpenAIStream and adapted it slightly. The function calls are now streamed out as text. Note that this would, of course, require further parsing.
import { AIStream, trimStartOfStreamHelper, type AIStreamCallbacks } from "ai";

function parseOpenAIStream(): (data: string) => string | void {
  const trimStartOfStream = trimStartOfStreamHelper();
  return (data) => {
    // TODO: Needs a type
    const json = JSON.parse(data);
    // This can be used for either chat or completion models.
    const text = trimStartOfStream(
      json.choices[0]?.delta?.function_call
        ? JSON.stringify(json.choices[0]?.delta?.function_call)
        : json.choices[0]?.delta?.content ?? json.choices[0]?.text ?? ""
    );
    return text;
  };
}

export function OpenAIStream(
  res: Response,
  cb?: AIStreamCallbacks
): ReadableStream {
  return AIStream(res, parseOpenAIStream(), cb);
}
from ai.
I am also checking in here because of this. So, two things I found:
- it seems OpenAI returns function calls in streams: https://community.openai.com/t/function-calls-and-streaming/263393/2
- this SDK uses its AIStream, whose custom parser only looks at delta.content / text, so it swallows the possibility of streaming the function-call response
To your question regarding JSON closing tags: I implemented a streaming, plugin-aware chatbot a month ago. What you are looking for is basically a parser that buffers tokens whenever they syntactically look like JSON, until you have one full valid JSON document, and then yields that whole thing.
To make it easier, I used prompt engineering to wrap any JSON in a |START| and |END| tag. This way I can more easily start buffering.
So in the end you will have a stream that looks like:
Of
course
let
me
check
the
plugin
|START| -- start buffering now
.buffer: {
.buffer: { function_call
.buffer: { function_call: check_the_weather, arguments ...}
|END| -- set |END| as a stop word for the model
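The buffering scheme above can be sketched as a small stateful token filter. This is my own minimal sketch, assuming the |START| and |END| sentinels arrive as whole tokens (prompt engineering as described, and no sentinel split across token boundaries):

```typescript
// Sketch of the |START|/|END| buffering approach: plain text streams
// through, anything between the sentinels is buffered and yielded whole.
function createSentinelBuffer() {
  let buffer: string | null = null; // non-null while inside |START|…|END|
  return (token: string): string | null => {
    if (token === "|START|") {
      buffer = "";
      return null; // start buffering, emit nothing yet
    }
    if (token === "|END|") {
      const complete = buffer;
      buffer = null;
      return complete; // yield the fully buffered JSON at once
    }
    if (buffer !== null) {
      buffer += token;
      return null; // still buffering
    }
    return token; // plain text passes through untouched
  };
}
```

For example, feeding the tokens `"Of"`, `"|START|"`, `'{"a":'`, `"1}"`, `"|END|"` emits `"Of"` immediately and the complete `'{"a":1}'` only on the closing sentinel.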
Anyway, that is a completely different implementation than the one OpenAI suggests, since their function calls cannot arrive alongside text from the AI. But I still thought it worth sharing.
Thanks! It looks like streaming function calls is relatively easy, but parsing partial arguments is still challenging.
I looked at the results and they do it quite cleverly. I like that each chunk is always valid JSON on its own, so the parser can simply emit some annotation while the function call is streaming and merge the pieces afterwards.
I got this out of the stream:
{"name":"get_current_weather","arguments":""}
{"arguments":"{\n"}
{"arguments":" "}
{"arguments":" \""}
{"arguments":"location"}
{"arguments":"\":"}
{"arguments":" \""}
{"arguments":"Berlin"}
{"arguments":"\"\n"}
{"arguments":"}"}
So what you need to do is just merge the pieces together. For the arguments in this example, that comes down to concatenating the string fragments.
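The merge described above can be sketched as follows. `mergeFunctionCall` is a hypothetical helper of my own, assuming the deltas arrive in order and the `arguments` fragments concatenate into one JSON document (as in the stream shown):

```typescript
// Sketch: merge streamed function-call deltas by concatenating the
// argument fragments, then parse the result once the stream is done.
interface FunctionCallDelta {
  name?: string;
  arguments?: string;
}

function mergeFunctionCall(deltas: FunctionCallDelta[]): {
  name: string;
  arguments: unknown;
} {
  // The name arrives once, in the first delta of the stream.
  const name = deltas.find((d) => d.name)?.name ?? "";
  // Argument fragments are plain string pieces of one JSON document,
  // so concatenating them in order reassembles it.
  const argsJson = deltas.map((d) => d.arguments ?? "").join("");
  return { name, arguments: JSON.parse(argsJson) };
}
```

Applied to the chunks above, this yields the name `get_current_weather` and the parsed object `{ location: "Berlin" }`.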
Here is an updated version I just monkey-coded together; it might help.
import { AIStream, trimStartOfStreamHelper, type AIStreamCallbacks } from "ai";

function parseOpenAIStream(): (data: string) => string | void {
  const trimStartOfStream = trimStartOfStreamHelper();
  let currentFunctionCall: {
    name: string;
    arguments: string[];
  } | null = null;
  return (data) => {
    // TODO: Needs a type
    const json = JSON.parse(data);
    if (json.choices[0]?.delta?.function_call) {
      if (!currentFunctionCall) {
        currentFunctionCall = {
          name: json.choices[0].delta.function_call.name,
          arguments: [json.choices[0].delta.function_call.arguments],
        };
      } else {
        currentFunctionCall.arguments.push(
          json.choices[0].delta.function_call.arguments
        );
      }
    }
    if (json.choices[0]?.finish_reason === "function_call") {
      const functionCall = currentFunctionCall;
      currentFunctionCall = null;
      return JSON.stringify({
        function_call: functionCall?.name,
        // The joined fragments form a single JSON object, so "{}" is the
        // sensible empty fallback here.
        arguments: JSON.parse(functionCall?.arguments.join("") ?? "{}"),
      });
    }
    // This can be used for either chat or completion models.
    const text = trimStartOfStream(
      json.choices[0]?.delta?.content ?? json.choices[0]?.text ?? ""
    );
    return text;
  };
}

export function OpenAIStream(
  res: Response,
  cb?: AIStreamCallbacks
): ReadableStream {
  return AIStream(res, parseOpenAIStream(), cb);
}
Unfortunately, worth noting: this parser has side effects, as it holds a buffer outside the returned function. There are better solutions. Anyway, it buffers the full function call inside an object, and only once it is complete is it streamed back to you.
I just put up a PR that allows function-call responses to be streamed back to clients (who can then parse the JSON once the response is finished). #154