Comments (17)
This issue has been fixed in React and will be included in the upcoming Next.js release. You can verify that in your repro with these commands:
npm add ai zod next@canary react@beta react-dom@beta --legacy-peer-deps
npm run build && npm start
Reference:
Related:
from ai.
Just looking in from the outside: the team appears to be preparing the Next.js 15 release, which will include React 19. The aforementioned React fixes were integrated in vercel/next.js#65058.
> I tested in my own app based on https://github.com/vercel/ai-chatbot where I'm using function calling to call external APIs and get a response back. Does it have to be the repo above?
No, but if the update doesn't help your case it's possible that you have a different situation. In this case it might make sense to create a separate issue with your own minimal reproduction.
Confirming this on Next.js 14.2.3.
Same here. I should also add that nested streamable UIs were working in Next.js 14.1.3.
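For anyone landing here without context, this is roughly the pattern being discussed: a minimal sketch of a nested streamable UI using the AI SDK's `createStreamableUI` from `ai/rsc`. `Spinner`, `WeatherCard`, and `fetchWeather` are hypothetical placeholders, not from this thread:

```typescript
import { createStreamableUI } from 'ai/rsc';

// Minimal sketch of a nested streamable UI: the child stream's value
// is rendered inside the parent stream's UI.
export async function getWeatherUI() {
  const parent = createStreamableUI(<Spinner />); // hypothetical loading component
  const child = createStreamableUI(<Spinner />);

  // Embed the child's streamed value inside the parent's UI.
  parent.update(
    <div>
      Forecast:
      {child.value}
    </div>
  );

  // Resolve the nested stream first, then close the parent.
  const data = await fetchWeather(); // hypothetical data helper
  child.done(<WeatherCard data={data} />);
  parent.done(
    <div>
      Forecast:
      {child.value}
    </div>
  );

  return parent.value;
}
```

Locally this streams both levels fine; the reports above are about it breaking only in production builds.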
Same issue with AI SDK 3.1.9 and Next.js 14.2.2.
My product is not yet in production, so that's OK, but it's a bummer if you rely on this feature 😅
I hope there will be a fix soon. Let us know if we can provide more details.
Same thing here... it works fine locally, so I'm not sure what the issue is. If someone is willing to drop a hint, I'd be glad to help take a look.
Thank you so much! You're a hero 🫡
Thanks @unstubbable, that's awesome. Can you point to the Next.js release that we'll have to upgrade to? Will it be 14.3.0 or a minor release of 14.2?
Hey @unstubbable, I actually tested by upgrading to your suggested versions:
npm add ai zod next@canary react@beta react-dom@beta --legacy-peer-deps
npm run build && npm start
However, not only does this not solve it, the issue now also appears locally. 🤔 Any ideas here?
Did you test it with https://github.com/amcclure-super/ai-streamableUI, or somewhere else?
I tested in my own app based on https://github.com/vercel/ai-chatbot where I'm using function calling to call external APIs and get a response back. Does it have to be the repo above?
I'm still a bit confused by this one. Should we wait for Next.js 15 to be released, or is there a suitable workaround? It's still in the docs, so it's supposedly supported, right?
OK, so it seems to work on AI SDK 3.1.x, but when you upgrade to 3.2.x, it breaks again.
Any ideas how we can fix that? It's quite annoying, as it's in the docs and works fine locally, but breaks only when deploying to production...
Are you running the Next.js release candidate, @lionelrudaz? 3.2.x is working for me in production.
Hey @amcclure-super, no, I'm using Next.js 14.2.4. I had to pin that version because it was the one that worked.
What's your Next.js version?
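If pinning a known-good version helps in your setup too, one way to lock it down (assuming npm; `--save-exact` writes `14.2.4` to package.json instead of the `^14.2.4` range, so a lockfile refresh can't silently bump it):

```shell
# Install and pin Next.js to the exact version that worked.
npm add next@14.2.4 --save-exact
```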
I tried upgrading to 3.3.3, and I still have exactly the same issue. Upgraded to Next.js 14.2.5: still the same.
You can see it live on my test server: https://release.tasters.ch/
Let me know if you need me to share some code.