Worker - Streaming: it takes quite a long time for the client to get the first data, is that normal?

I am following the example in the documentation: https://developers.cloudflare.com/workers-ai/models/llama-2-7b-chat-int8/#code-examples. Below is the code I paste in my browser console to log events coming from the worker. When I run it, I see the first log after ~6 seconds. That seems like quite a long time, is it normal? Thanks!
const source = new EventSource("/"); // Workers AI streaming endpoint
source.onmessage = (event) => {
if (event.data == "[DONE]") {
source.close();
return;
}
const data = JSON.parse(event.data);
console.log(data.response);
}
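For context, the Worker side is roughly the streaming pattern from that docs page; this is a simplified sketch rather than my exact Worker, and the prompt here is just a placeholder:

export default {
  async fetch(request, env) {
    // With stream: true the Workers AI binding returns a ReadableStream of
    // server-sent events instead of waiting for the full completion.
    const stream = await env.AI.run("@cf/meta/llama-2-7b-chat-int8", {
      prompt: "Tell me a short story", // placeholder prompt
      stream: true,
    });
    // Pass the SSE stream straight through to the browser's EventSource.
    return new Response(stream, {
      headers: { "content-type": "text/event-stream" },
    });
  },
};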
const source = new EventSource("/"); // Workers AI streaming endpoint
source.onmessage = (event) => {
if (event.data == "[DONE]") {
source.close();
return;
}
const data = JSON.parse(event.data);
console.log(data.response);
}
Cyb3r-Jak3 · 9mo ago
Yeah, I'd think it's normal because the AI model has to run to generate the content.
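If you want to confirm the delay is model generation time rather than something on your end, you could time the first message on the client, something like this (just a sketch, reusing your snippet):

const start = performance.now();
let gotFirst = false;
const source = new EventSource("/");
source.onmessage = (event) => {
  if (!gotFirst) {
    // Time from opening the stream to the first token arriving
    console.log(`first chunk after ${Math.round(performance.now() - start)} ms`);
    gotFirst = true;
  }
  if (event.data == "[DONE]") {
    source.close();
    return;
  }
  console.log(JSON.parse(event.data).response);
};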