Sunny
C#
Created by Sunny on 2/25/2024 in #help
Blazor App dotnet 8 WASM Render Mode with Azure AD(MS Entra)
No description
1 replies
C#
Created by Sunny on 2/22/2024 in #help
✅ Need help with streaming ChatGPT responses to a Blazor App
Hi, I have been going through this code for a couple of days now without understanding why I can't get streaming to work. I have an API that streams data to my Blazor App client. The Blazor App is set up to run as WebAssembly only, using <Routes @rendermode="new InteractiveWebAssemblyRenderMode(prerender: false)" /> and <HeadOutlet @rendermode="InteractiveWebAssembly" />. As you will see in the vid, I'm running on Kestrel, not IIS Express 🙂 I would expect the Blazor App to begin processing the streamed data as it arrives, not wait for the stream to finish. This is a simplified version of my endpoint:
[HttpPost("chat/stream")]
public async IAsyncEnumerable<PartialCompletion> GetCompletion(
    [FromBody] CompletionRequest request,
    [EnumeratorCancellation] CancellationToken cancellationToken = default)
{
    var response = await this.openAIClient.GetChatCompletionsStreamingAsync(
        new ChatCompletionsOptions
        {
            DeploymentName = request.ModelId,
            Messages =
            {
                new ChatRequestSystemMessage(this.options.Value.DefaultAssistant.SystemMessage),
                new ChatRequestUserMessage(request.Prompt)
            }
        },
        cancellationToken);

    await foreach (var streamingChat in response.EnumerateValues().WithCancellation(cancellationToken))
    {
        this.logger.LogInformation("Received partial: {Partial}", streamingChat.ContentUpdate);
        yield return new PartialCompletion(streamingChat.ContentUpdate);
        await this.HttpContext.Response.Body.FlushAsync(cancellationToken);
        await Task.Delay(100, cancellationToken);
    }
}
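For context on the consuming side: in browser WebAssembly, HttpClient buffers the whole response by default, so a client typically has to opt into response streaming and read from the response headers onward before it can process partials as they arrive. Below is a minimal sketch of such a consumer, not the poster's actual code; the ChatClient name, the relative URL, and the PartialCompletion(string ContentUpdate) record are assumptions made to match the endpoint above.

```csharp
using System.Net.Http.Json;
using System.Runtime.CompilerServices;
using Microsoft.AspNetCore.Components.WebAssembly.Http;

// Assumed DTO mirroring the server's PartialCompletion.
public record PartialCompletion(string ContentUpdate);

public class ChatClient
{
    private readonly HttpClient http;

    public ChatClient(HttpClient http) => this.http = http;

    public async IAsyncEnumerable<PartialCompletion> StreamCompletionAsync(
        object request,
        [EnumeratorCancellation] CancellationToken cancellationToken = default)
    {
        var message = new HttpRequestMessage(HttpMethod.Post, "chat/stream")
        {
            Content = JsonContent.Create(request)
        };
        // Without this, the browser's fetch buffers the entire response body
        // before handing it to the .NET side.
        message.SetBrowserResponseStreamingEnabled(true);

        // ResponseHeadersRead lets reading start before the stream completes.
        using var response = await this.http.SendAsync(
            message, HttpCompletionOption.ResponseHeadersRead, cancellationToken);
        response.EnsureSuccessStatusCode();

        // Deserializes the JSON array incrementally, element by element.
        await foreach (var partial in response.Content
            .ReadFromJsonAsAsyncEnumerable<PartialCompletion>(cancellationToken))
        {
            if (partial is not null)
            {
                yield return partial;
            }
        }
    }
}
```

A caller can `await foreach` over `StreamCompletionAsync(...)` and update the UI on each partial.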
My frontend code would have been here, but due to slow mode you will find it in the comments. Does anyone see the issue I'm missing here?
3 replies