Cloudflare Pages + SvelteKit streaming responses

Hi everyone, I'm struggling to stream a response using Cloudflare Pages (with Workers underneath for dynamic requests) + SvelteKit. I've created a minimal reproducible example to showcase the issue: `stream` mocks an object that emits tokens the way an LLM would. When this endpoint is called, the response is buffered and delivered as a whole, only after all characters have been read, with no `Transfer-Encoding: chunked` header. When run locally with Vite, the response streams correctly. I referenced other topics in this forum (https://discord.com/channels/595317990191398933/1152905087266586635) to create this example, which I believe should work. Does anyone have an idea what to fix in the code below? Thanks in advance 😉
import type { RequestEvent, RequestHandler } from '@sveltejs/kit';

export const POST: RequestHandler = async ({ url }: RequestEvent) => {
	const stream = createLetterStream();

	const { readable, writable } = new TransformStream();

	const writer = writable.getWriter();
	const encoder = new TextEncoder();
	const response = new Response(readable);

	// Pump the letter stream into the writable side without blocking the return.
	(async () => {
		try {
			for await (const chunk of stream as any) {
				await writer.write(encoder.encode(chunk));
			}
			await writer.close();
		} catch (e) {
			// A Response returned from this detached IIFE would be ignored;
			// abort the writer so the client sees the failure instead.
			await writer.abort(e);
		}
	})();

	return response;
};

const createLetterStream = () => {
	let index = 0;
	const text = 'Hello, World!';

	return new ReadableStream<string>({
		start(controller) {
			const timer = setInterval(() => {
				if (index < text.length) {
					controller.enqueue(text.charAt(index));
					index++;
				} else {
					controller.close();
					clearInterval(timer);
				}
			}, 100);
		},
	});
};
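For comparison, here is a minimal sketch that skips the TransformStream pump entirely and returns the ReadableStream directly, piped through TextEncoderStream. This is my own variant, not code from the thread; the Content-Type and Cache-Control values are assumptions (no-transform is sometimes suggested to discourage intermediaries from buffering), and I can't confirm it changes the Pages behavior described above.

```typescript
// Sketch: return the stream directly instead of pumping it through a
// TransformStream. Header values below are assumptions, not a verified fix.
const createLetterStream = (): ReadableStream<string> => {
	let index = 0;
	const text = 'Hello, World!';
	return new ReadableStream<string>({
		pull(controller) {
			if (index < text.length) {
				controller.enqueue(text.charAt(index++));
			} else {
				controller.close();
			}
		},
	});
};

export const POST = async (): Promise<Response> =>
	new Response(createLetterStream().pipeThrough(new TextEncoderStream()), {
		headers: {
			// Assumed headers: no-transform asks intermediaries not to rewrite
			// or buffer the body; Content-Type is explicit for clarity.
			'Content-Type': 'text/plain; charset=utf-8',
			'Cache-Control': 'no-cache, no-transform',
		},
	});
```

One upside of this shape is that there is no detached async task: any error in the source stream propagates to the response body automatically.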
1 Reply
kunek (OP) · 13mo ago
Some additional analysis from the deployed application:
- the response does NOT have "Transfer-Encoding: chunked" when viewed in Chrome DevTools; it's buffered and only received as one full message at the end.
- in Safari, the response is streamed correctly! This really caught me off guard.
Does anybody have any ideas about this difference in behavior?
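One way to check this kind of thing from code rather than DevTools is to read the body chunk by chunk and log when each chunk arrives: a buffered response shows up as one large chunk near the end, while a streamed response yields many small chunks spread over time. This is my own diagnostic sketch, not from the thread:

```typescript
// Diagnostic sketch (my addition): count body chunks and their arrival
// times. One big late chunk suggests buffering; many small ones suggest
// the response is genuinely streamed.
async function countChunks(res: Response): Promise<{ chunks: number; bytes: number }> {
	const reader = res.body!.getReader();
	let chunks = 0;
	let bytes = 0;
	for (;;) {
		const { done, value } = await reader.read();
		if (done) break;
		chunks++;
		bytes += value.length;
		console.log(`chunk ${chunks} (${value.length} bytes) at ${Date.now()}`);
	}
	return { chunks, bytes };
}
```

Usage would be something like `countChunks(await fetch('/api/stream', { method: 'POST' }))`, where `/api/stream` is a placeholder for your actual endpoint.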