Next.js/Vercel AI/Streaming Issue
I want to make Cloudflare aware of this issue. It seems that when using Vercel's AI package (on Vercel hosting) behind Cloudflare, streaming responses no longer work if Cloudflare's proxy feature is on. https://github.com/vercel/ai/issues/633
Just pinging this again, it seems Vercel is indicating it's on Cloudflare's end
same issue here, hope someone looks into this soon
Hey @someguy can you walk me through your setup and where this is failing please? Want to confirm what Vercel is saying
Using the Vercel preview domain (on their subdomain), streaming from OpenAI works (using Vercel's AI package with its streaming helpers)
Using the live site proxied behind CF with Vercel's IP in an A record (76.76.21.21), the stream just returns as one massive chunk, so it still works functionally, but streaming seems to be off.
Oddly enough I have a second project on Vercel on a subdomain which uses a CNAME instead (also behind the CF proxy) and streaming works there. So I wonder if it's a difference between A record and CNAME
let me know if this helps!
Just tested that theory and it doesn't seem to matter whether it's an A record or a CNAME; it must be down to different project details
If you want a test account feel free to DM me, I turned the proxy off and can confirm it works with it off, then stops working with it on
weirdly enough, both projects use the same Vercel call and function
interesting bug... the CNAME site vs. the A record site, though?
Ignore the cname mention, it doesn't seem related
ok so the only difference is that one is through the Vercel preview domain and the other is a custom zone you have with Vercel?
The main difference is that turning the CF proxy option on/off is what causes the issue
so I can only guess it's CF causing the issue
maybe... where is this CF proxy option you're interacting with? on vercel or is it when you change DNS to cloudflare?
This option
in Cloudflare
what's the website? can DM too if you don't want to talk about it publicly.
RightBlogger Blog
All functions that involve streaming require login (you can make a free account to submit once per tool)
you'll see the response just comes back as one large output rather than streaming
thanks, will try to repro and see what's going on
thank you!
Just here to say I have the exact same issue. Issue goes away when I disable the Cloudflare Proxy within Cloudflare. Any progress @ack ?
I have the same issue when using a Vercel CNAME within my Cloudflare DNS settings. So we can probably rule that out. My project is also on a subdomain, like app.somewebsite.com
@ack I have made a temporary reproduction here:
Cloudflare Proxy on: https://chat.jordyvandenaardweg.nl/
Cloudflare Proxy off: https://chat-proxy-off.jordyvandenaardweg.nl/
Not behind Cloudflare: https://nextjs-chat-cloudflare.vercel.app/
All URLs point to the same deployment.
Try it out with the prompt: what is ai?
That prompt gives a reasonable response length you can use to debug it
You'll notice the one behind the proxy is not streaming the responses, token by token, but just shows the response when it's done. So that takes a few seconds.
The ones NOT behind Cloudflare Proxy show that streaming DOES work.
Please do not share these URLs, as I'm using an OpenAI key and each request will add to my usage there. But feel free to use them for debugging.
The code is here; it's based on the Vercel AI Chatbot template, where I only removed the Auth/GitHub login and updated all the dependencies to their latest versions:
https://github.com/jvandenaardweg/nextjs-chat-cloudflare
This is the API route that handles the incoming OpenAI stream: https://github.com/jvandenaardweg/nextjs-chat-cloudflare/blob/main/app/api/chat/route.ts
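For context, a streaming route like that in the ai v2 SDK typically looked something like the sketch below (based on the public Vercel AI SDK docs of that era, not the exact contents of that file; the model name and env var are assumptions):

```typescript
import OpenAI from "openai";
import { OpenAIStream, StreamingTextResponse } from "ai";

// Run as an Edge Function so the response body can stream.
export const runtime = "edge";

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

export async function POST(req: Request) {
  const { messages } = await req.json();

  // Ask OpenAI for a streamed chat completion.
  const response = await openai.chat.completions.create({
    model: "gpt-3.5-turbo", // assumed model for illustration
    stream: true,
    messages,
  });

  // Convert the OpenAI stream into a web ReadableStream and return it,
  // so the client receives tokens as they are generated. The Cloudflare
  // buffering discussed in this thread happened downstream of this code.
  const stream = OpenAIStream(response);
  return new StreamingTextResponse(stream);
}
```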
Just to clarify: I had streaming working on Vercel behind Cloudflare Proxy for months, but it stopped working recently. According to messages I can find online from others with the same issue, it stopped working about 2 weeks ago.
Messages I could find with the same issue:
https://github.com/vercel/ai/issues/633
https://community.cloudflare.com/t/buffering-of-responses-did-not-used-to-be-a-problem-now-is/567306
https://community.cloudflare.com/t/streaming-edge-function-hosted-in-vercel-is-buffered-by-cloudflare-proxy/567635
https://github.com/vercel/ai/issues/239#issuecomment-1756185750
https://twitter.com/baptisteArno/status/1710282829483888786
https://twitter.com/thetrungvu/status/1712861296549409017
I've created a support ticket with the above reproduction info
Exactly the same issue on Typebot! It used to work, but all of a sudden it started to buffer the streaming response. (https://twitter.com/baptisteArno/status/1710282829483888786)
I'd say it's pretty critical as I have to disable Cloudflare proxy for now
Yeah same, the current workaround is disabling Cloudflare Proxy completely. I also tried disabling almost everything using Page Rules, but that did not work. The only way to get it working is the off switch in the DNS settings for Cloudflare Proxy
I'll keep you guys updated on the support ticket
Also opened a topic some time ago: https://discord.com/channels/595317990191398933/1159897955524935690
Thank you, it took me a while but I was able to repro. Can people see if they are still experiencing the issue now that a change has gone out?
Got a response back, they fixed it! See the behaviour on the reproduction: https://chat.jordyvandenaardweg.nl/ ; the response now comes streaming in again. I also enabled Cloudflare Proxy on my production app, and everything works again
I'll be removing the reproduction demos later.
Thanks all!
woot
Thanks @ack
everything works now, thanks a lot to everyone involved!
Working for me woooo
Thanks!
Hi, I'm not using Vercel AI, but these recent messages came up when I searched for streaming response issues. I'm unable to get a Cloudflare Worker to stream any responses using either a ReadableStream or a TransformStream. Instead, the response is buffered and sent as a single chunk once it's complete. Are there specific configuration requirements? I've tried setting the Transfer-Encoding header to chunked and setting no-cache, but it's still an issue. Is there a way to upgrade my Workers or something so they can respond with streaming responses? Thanks
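For what it's worth, the usual Worker pattern is to return the readable side of an identity TransformStream immediately while piping the upstream body into the writable side in the background (via `ctx.waitUntil`); the piping itself preserves chunk boundaries. Here's a self-contained sketch of that piping behaviour using just the Web Streams API (runs in Node 18+ / Deno, no Workers runtime; the chunk values are made up for the example):

```typescript
// Demonstrates that piping through an identity TransformStream forwards
// chunks one at a time instead of buffering them into a single blob.
async function streamDemo(): Promise<string[]> {
  const encoder = new TextEncoder();
  const decoder = new TextDecoder();

  // Simulated upstream source emitting three chunks with a small delay,
  // standing in for e.g. a streamed fetch() response body.
  const upstream = new ReadableStream<Uint8Array>({
    async start(controller) {
      for (const token of ["Hello", " ", "world"]) {
        controller.enqueue(encoder.encode(token));
        await new Promise((resolve) => setTimeout(resolve, 10));
      }
      controller.close();
    },
  });

  // In a Worker you would `return new Response(readable, ...)` right away
  // and pipe in the background; here we just read the chunks to show they
  // arrive individually.
  const { readable, writable } = new TransformStream<Uint8Array, Uint8Array>();
  upstream.pipeTo(writable); // intentionally not awaited: piping runs concurrently

  const received: string[] = [];
  const reader = readable.getReader();
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    received.push(decoder.decode(value));
  }
  return received;
}

streamDemo().then((chunks) => console.log(chunks.length, chunks.join("")));
// prints: 3 Hello world
```

Note that even with correct Worker-side streaming, the proxy layer in front can still buffer the response, which is exactly what this thread was about.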
@swiecki I suggest creating a new topic in #general-help so others can help you with your specific issue ✌️
@swiecki were you able to find a solution for this? I'm losing my mind, spent hours thinking it's an issue with my implementation
@anurag here's what worked for me
Getting that exact issue once again on one of my domains. Disabling the proxy makes it work again
I submitted a ticket to Cloudflare support