Comments (2)
You can edit the examples inside the fork pretty easily (run pnpm build at the root and pnpm build in packages/core). For projects outside the fork, you can use the package-linking mechanism.
from ai.
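For reference, a typical pnpm-link workflow looks roughly like this. This is a sketch, not the maintainers' documented procedure; the paths are illustrative and your fork's layout may differ:

```shell
# Build the SDK inside the fork (assumes a pnpm workspace checkout of the fork)
cd ~/code/ai            # illustrative path to your fork
pnpm install
pnpm build              # root build
cd packages/core
pnpm build              # rebuild after editing svelte/use-chat.ts

# Link the locally built package into your app instead of the npm release
cd ~/code/my-app        # illustrative path to your project
pnpm link ~/code/ai/packages/core
```

After linking, your app resolves the package from the fork's build output, so you need to re-run the package build whenever you change the source.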
I managed to build the examples and test my changes there, but I had no luck linking the library (only svelte/use-chat.ts was changed) into my project. Could you be more specific about how to do this?
Also, have you experienced any of the problems I've seen with useChat in Svelte?
Below is my use-chat.ts, in case someone else is running into the same problems I've had:
import { type Readable, type Writable, get, writable } from 'svelte/store';
import { callChatApi } from '../shared/call-chat-api';
import { processChatStream } from '../shared/process-chat-stream';
import type {
ChatRequest,
ChatRequestOptions,
CreateMessage,
IdGenerator,
JSONValue,
Message,
UseChatOptions,
} from 'ai';
import { generateId as generateIdFunc } from '../shared/generate-id';
export type { CreateMessage, Message, UseChatOptions };
export type UseChatHelpers = {
messages: Readable<Message[]>;
error: Readable<undefined | Error>;
append: (
message: Message | CreateMessage,
chatRequestOptions?: ChatRequestOptions,
) => Promise<string | null | undefined>;
reload: (
chatRequestOptions?: ChatRequestOptions,
) => Promise<string | null | undefined>;
stop: () => void;
setMessages: (messages: Message[]) => void;
input: Writable<string>;
handleSubmit: (e: any, chatRequestOptions?: ChatRequestOptions) => void;
metadata?: Object;
isLoading: Readable<boolean | undefined>;
data: Readable<JSONValue[] | undefined>;
};
const getStreamedResponse = async (
api: string,
chatRequest: ChatRequest,
mutate: (messages: Message[]) => void,
mutateStreamData: (data: JSONValue[] | undefined) => void,
existingData: JSONValue[] | undefined,
extraMetadata: {
credentials?: RequestCredentials;
headers?: Record<string, string> | Headers;
body?: any;
},
previousMessages: Message[],
abortControllerRef: AbortController | null,
generateId: IdGenerator,
streamMode?: 'stream-data' | 'text',
onFinish?: (message: Message) => void,
onResponse?: (response: Response) => void | Promise<void>,
sendExtraMessageFields?: boolean,
) => {
mutate(chatRequest.messages);
const constructedMessagesPayload = sendExtraMessageFields
? chatRequest.messages
: chatRequest.messages.map(
({ role, content, name, function_call, tool_calls, tool_call_id }) => ({
role,
content,
tool_call_id,
...(name !== undefined && { name }),
...(function_call !== undefined && {
function_call: function_call,
}),
...(tool_calls !== undefined && {
tool_calls: tool_calls,
}),
}),
);
return await callChatApi({
api,
messages: constructedMessagesPayload,
body: {
...extraMetadata.body,
...chatRequest.options?.body,
...(chatRequest.functions !== undefined && {
functions: chatRequest.functions,
}),
...(chatRequest.function_call !== undefined && {
function_call: chatRequest.function_call,
}),
...(chatRequest.tools !== undefined && {
tools: chatRequest.tools,
}),
...(chatRequest.tool_choice !== undefined && {
tool_choice: chatRequest.tool_choice,
}),
},
streamMode,
credentials: extraMetadata.credentials,
headers: {
...extraMetadata.headers,
...chatRequest.options?.headers,
},
abortController: () => abortControllerRef,
restoreMessagesOnFailure() {
mutate(previousMessages);
},
onResponse,
onUpdate(merged, data) {
mutate([...chatRequest.messages, ...merged]);
mutateStreamData([...(existingData || []), ...(data || [])]);
},
onFinish,
generateId,
});
};
const chatStores = new Map<string, UseChatHelpers>();
export function useChat({
api = '/api/chat',
id,
initialMessages = [],
initialInput = '',
sendExtraMessageFields,
experimental_onFunctionCall,
experimental_onToolCall,
streamMode,
onResponse,
onFinish,
onError,
credentials,
headers,
body,
generateId = generateIdFunc,
}: UseChatOptions = {}): UseChatHelpers {
const chatId = id || `chat-${generateId()}`;
if (chatStores.has(chatId)) {
return chatStores.get(chatId) as UseChatHelpers;
}
const messages = writable<Message[]>(initialMessages);
const streamData = writable<JSONValue[] | undefined>(undefined);
const isLoading = writable<boolean>(false);
const error = writable<undefined | Error>(undefined);
const input = writable(initialInput);
const mutate = (data: Message[]) => {
messages.set(data);
};
let abortController: AbortController | null = null;
const extraMetadata = {
credentials,
headers,
body,
};
async function triggerRequest(chatRequest: ChatRequest) {
try {
error.set(undefined);
isLoading.set(true);
abortController = new AbortController();
await processChatStream({
getStreamedResponse: () =>
getStreamedResponse(
api,
chatRequest,
mutate,
data => {
streamData.set(data);
},
get(streamData),
extraMetadata,
get(messages),
abortController,
generateId,
streamMode,
onFinish,
onResponse,
sendExtraMessageFields,
),
experimental_onFunctionCall,
experimental_onToolCall,
updateChatRequest: chatRequestParam => {
chatRequest = chatRequestParam;
},
getCurrentMessages: () => get(messages),
});
abortController = null;
return null;
} catch (err) {
if ((err as any).name === 'AbortError') {
abortController = null;
return null;
}
if (onError && err instanceof Error) {
onError(err);
}
error.set(err as Error);
} finally {
isLoading.set(false);
}
}
const append: UseChatHelpers['append'] = async (
message: Message | CreateMessage,
{
options,
functions,
function_call,
tools,
tool_choice,
}: ChatRequestOptions = {},
) => {
if (!message.id) {
message.id = generateId();
}
const chatRequest: ChatRequest = {
messages: get(messages).concat(message as Message),
options,
...(functions !== undefined && { functions }),
...(function_call !== undefined && { function_call }),
...(tools !== undefined && { tools }),
...(tool_choice !== undefined && { tool_choice }),
};
return triggerRequest(chatRequest);
};
const reload: UseChatHelpers['reload'] = async ({
options,
functions,
function_call,
tools,
tool_choice,
}: ChatRequestOptions = {}) => {
const messagesSnapshot = get(messages);
if (messagesSnapshot.length === 0) return null;
const lastMessage = messagesSnapshot.at(-1);
if (lastMessage?.role === 'assistant') {
const chatRequest: ChatRequest = {
messages: messagesSnapshot.slice(0, -1),
options,
...(functions !== undefined && { functions }),
...(function_call !== undefined && { function_call }),
...(tools !== undefined && { tools }),
...(tool_choice !== undefined && { tool_choice }),
};
return triggerRequest(chatRequest);
}
const chatRequest: ChatRequest = {
messages: messagesSnapshot,
options,
...(functions !== undefined && { functions }),
...(function_call !== undefined && { function_call }),
...(tools !== undefined && { tools }),
...(tool_choice !== undefined && { tool_choice }),
};
return triggerRequest(chatRequest);
};
const stop = () => {
if (abortController) {
abortController.abort();
abortController = null;
}
};
const setMessages = (messages: Message[]) => {
mutate(messages);
};
const handleSubmit = (e: any, options: ChatRequestOptions = {}) => {
e.preventDefault();
const inputValue = get(input);
if (!inputValue) return;
append(
{
content: inputValue,
role: 'user',
createdAt: new Date(),
},
options,
);
input.set('');
};
const chatHelpers: UseChatHelpers = {
messages,
error,
append,
reload,
stop,
setMessages,
input,
handleSubmit,
isLoading,
data: streamData,
};
chatStores.set(chatId, chatHelpers);
return chatHelpers;
}
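The main change in this file is the chatStores Map: calls with the same id now share one set of stores, while calls without an id get a fresh, unshared instance. A minimal standalone sketch of that caching pattern (names like createChat and ChatHandle are hypothetical, and Svelte stores are replaced by a plain message array so it runs without dependencies):

```typescript
// Sketch of the per-id caching pattern used in the use-chat.ts above.
type ChatHandle = { id: string; messages: string[] };

const chatCache = new Map<string, ChatHandle>();
let counter = 0;
const generateId = (): string => `gen-${++counter}`;

function createChat(id?: string): ChatHandle {
  const chatId = id ?? `chat-${generateId()}`;
  // Returning the cached handle means two components that pass the same id
  // share one state object instead of clobbering each other's messages.
  const existing = chatCache.get(chatId);
  if (existing) return existing;
  const handle: ChatHandle = { id: chatId, messages: [] };
  chatCache.set(chatId, handle);
  return handle;
}

const a = createChat('support');
const b = createChat('support');
const c = createChat(); // no id: a fresh, unshared handle each call
console.log(a === b); // true
console.log(a === c); // false
```

One caveat of this approach: entries are never evicted from the Map, so long-lived apps that create many distinct chat ids will retain every store for the lifetime of the page.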