✅ Need help with streaming ChatGPT response to Blazor App
Hi,
I've been through this code for a couple of days now without understanding why I can't get streaming to work.
I have an API that streams data to my Blazor App client. The Blazor App is set up to run as WebAssembly only, using
<Routes @rendermode="new InteractiveWebAssemblyRenderMode(prerender: false)" />
and <HeadOutlet @rendermode="InteractiveWebAssembly" />
As you will see in the vid, I'm running in Kestrel, not IIS Express 🙂
I would expect the Blazor App to begin processing the streamed data as it arrives, not wait for the stream to finish.
This is a simplified version of my endpoint:
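(The original endpoint snippet isn't captured in this post, so here is a minimal sketch of what a streaming minimal-API endpoint typically looks like; the route `/api/chat`, the chunk contents, and the delay are placeholders, not the poster's actual code.)

```csharp
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Http;

var builder = WebApplication.CreateBuilder(args);
var app = builder.Build();

// Hypothetical streaming endpoint: writes a chunk, flushes it to the
// client immediately, and waits before producing the next one.
app.MapGet("/api/chat", async (HttpContext ctx) =>
{
    ctx.Response.ContentType = "text/plain";
    for (var i = 0; i < 10; i++)
    {
        await ctx.Response.WriteAsync($"chunk {i}\n");
        await ctx.Response.Body.FlushAsync(); // push each chunk without buffering
        await Task.Delay(500);
    }
});

app.Run();
```

The explicit `FlushAsync` is what makes each chunk leave the server right away instead of being buffered until the response completes.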
My frontend code would have been here, but due to slow mode you will find it in the comments.
Anyone see the issue I can't see here?
And this is my frontend code that should read the stream and update the UI while the stream is running:
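(The frontend snippet itself isn't captured here either, so this is a sketch of the usual pattern for reading a streamed response in a Blazor WebAssembly component; `Http`, `output`, and the `api/chat` route are assumed names.)

```csharp
using System.Net.Http;
using Microsoft.AspNetCore.Components.WebAssembly.Http;

using var request = new HttpRequestMessage(HttpMethod.Get, "api/chat");

// Without ResponseHeadersRead, SendAsync buffers the entire body
// before returning, so no incremental updates are possible.
using var response = await Http.SendAsync(
    request, HttpCompletionOption.ResponseHeadersRead);

using var stream = await response.Content.ReadAsStreamAsync();
using var reader = new StreamReader(stream);

while (await reader.ReadLineAsync() is { } line)
{
    output += line;      // append each chunk as it arrives
    StateHasChanged();   // re-render the component incrementally
}
```

Note that on WebAssembly even this pattern still buffers unless browser response streaming is enabled on the request, which is exactly the fix described below.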
Finally got to the bottom of this issue.
Created a new project to simplify things.
This is the magic that fixed it:
requestMessage.Options.Set(new HttpRequestOptionsKey<bool>("WebAssemblyEnableStreamingResponse"), true);
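For anyone else hitting this: the same flag can be set through the dedicated extension method in `Microsoft.AspNetCore.Components.WebAssembly.Http`, which avoids spelling out the option-key string by hand (the `api/chat` route and `Http` client name are placeholders):

```csharp
using System.Net.Http;
using Microsoft.AspNetCore.Components.WebAssembly.Http;

var request = new HttpRequestMessage(HttpMethod.Get, "api/chat");

// Equivalent to setting the "WebAssemblyEnableStreamingResponse"
// option key: tells the browser fetch layer not to buffer the body.
request.SetBrowserResponseStreamingEnabled(true);

var response = await Http.SendAsync(
    request, HttpCompletionOption.ResponseHeadersRead);
```

Blazor WebAssembly requests go through the browser's `fetch` API, which buffers responses by default; this flag opts the request into streaming so `ReadAsStreamAsync` yields data as it arrives.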