C# • 10mo ago
Sunny

✅ Need help with streaming ChatGPT response to Blazor App

Hi, I've been going through this code for a couple of days now without understanding why I can't get streaming to work. I have an API that streams data to my Blazor App client. The Blazor App is set up to run as WebAssembly only, using <Routes @rendermode="new InteractiveWebAssemblyRenderMode(prerender: false)" /> and <HeadOutlet @rendermode="InteractiveWebAssembly" />. As you will see in the vid, I'm running under Kestrel, not IIS Express 🙂 I would expect the Blazor App to begin processing the streamed data as it arrives instead of waiting for the stream to finish. This is a simplified version of my endpoint:
[HttpPost("chat/stream")]
public async IAsyncEnumerable<PartialCompletion> GetCompletion(
    [FromBody] CompletionRequest request,
    [EnumeratorCancellation] CancellationToken cancellationToken = default)
{
    var response = await this.openAIClient.GetChatCompletionsStreamingAsync(
        new ChatCompletionsOptions
        {
            DeploymentName = request.ModelId,
            Messages =
            {
                new ChatRequestSystemMessage(this.options.Value.DefaultAssistant.SystemMessage),
                new ChatRequestUserMessage(request.Prompt)
            }
        },
        cancellationToken);

    await foreach (var streamingChat in response.EnumerateValues().WithCancellation(cancellationToken))
    {
        this.logger.LogInformation("Received partial: {Partial}", streamingChat.ContentUpdate);
        yield return new PartialCompletion(streamingChat.ContentUpdate);
        await this.HttpContext.Response.Body.FlushAsync(cancellationToken);
        await Task.Delay(100, cancellationToken);
    }
}
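For reference, the request/response DTOs aren't shown above. Inferred from how they're used here and in the frontend code, they would be roughly (the exact definitions may differ):

// Sketch of the DTO shapes inferred from usage; actual definitions may differ.
public record CompletionRequest(string ModelId, string Prompt);
public record PartialCompletion(string Content);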
My frontend code would have been here, but due to slow mode you will find it in the comments. Does anyone see the issue I'm missing?
1 Reply
Sunny (OP) • 10mo ago
And this is my frontend code that should read the stream and update the UI while the stream is running:
JsonSerializerOptions options = new(JsonSerializerDefaults.Web) { Converters = { new JsonStringEnumConverter() } };

var requestMessage = new HttpRequestMessage(HttpMethod.Post, "api/openai/chat/stream")
{
    Content = JsonContent.Create(new CompletionRequest(selectedModel.ModelId, message), null, options)
};

var completionResponse = await HttpClient.SendAsync(requestMessage, HttpCompletionOption.ResponseHeadersRead, CancellationTokenSource.Token);
if (!completionResponse.IsSuccessStatusCode)
{
    Snackbar.Add("Failed to send message", Severity.Error);
    return;
}

Messages.Add(new MessageItem(ParticipantRole.Assistant, DateTimeOffset.UtcNow, ""));

// Read the stream
Console.WriteLine("Reading the completion stream");
var stringBuilder = new StringBuilder();
using var stream = await completionResponse.Content.ReadAsStreamAsync(CancellationTokenSource.Token);
await foreach (var partialCompletion in JsonSerializer.DeserializeAsyncEnumerable<PartialCompletion>(stream, options, CancellationTokenSource.Token))
{
    if (partialCompletion?.Content == null) continue;
    Console.WriteLine(partialCompletion.Content);
    stringBuilder.Append(partialCompletion.Content);

    Messages[^1] = new MessageItem(ParticipantRole.Assistant, DateTimeOffset.UtcNow, stringBuilder.ToString().ToHtml());

    await Task.Delay(100);
    StateHasChanged();
}
finalResponse = stringBuilder.ToString().ToHtml();
Console.WriteLine("Completion stream ended");
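For context, the snippet relies on component members that aren't shown above. This is a rough sketch of what they might look like: the ISnackbar injection assumes MudBlazor, ToHtml() is presumably a custom Markdown-to-HTML extension (not defined here), and the record/enum shapes are inferred from usage:

// Rough sketch of the component state used above; shapes inferred from usage.
[Inject] public HttpClient HttpClient { get; set; } = default!;
[Inject] public ISnackbar Snackbar { get; set; } = default!; // MudBlazor, inferred from Snackbar.Add(..., Severity.Error)

private readonly CancellationTokenSource CancellationTokenSource = new();
private readonly List<MessageItem> Messages = new();
private string finalResponse = "";

public record MessageItem(ParticipantRole Role, DateTimeOffset Timestamp, string Content);
public enum ParticipantRole { System, User, Assistant }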
Finally got to the bottom of this issue. I created a new project to simplify things. This is the magic that fixed it: requestMessage.Options.Set(new HttpRequestOptionsKey<bool>("WebAssemblyEnableStreamingResponse"), true);
var requestMessage = new HttpRequestMessage(HttpMethod.Post, "api/values/stream");
requestMessage.Options.Set(new HttpRequestOptionsKey<bool>("WebAssemblyEnableStreamingResponse"), true); // Magic that enables streaming

Console.WriteLine("[Frontend] Sending request");
var response = await HttpClient.SendAsync(requestMessage, HttpCompletionOption.ResponseHeadersRead);
Console.WriteLine("[Frontend] Request sent");

var stringBuilder = new StringBuilder();
Console.WriteLine("[Frontend] Deserializing stream");
await foreach (var dataToken in response.Content.ReadFromJsonAsAsyncEnumerable<DataToken>())
{
    Console.WriteLine("[Frontend] Data received: " + dataToken?.Content);

    if (dataToken == null) continue;

    stringBuilder.Append(dataToken.Content);
    this.message = stringBuilder.ToString();

    StateHasChanged();
    await Task.Delay(100);
}
Console.WriteLine("[Frontend] Stream deserialized");
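Side note: that option key is what the SetBrowserResponseStreamingEnabled extension method from Microsoft.AspNetCore.Components.WebAssembly.Http sets under the hood, so the same line can also be written as:

using Microsoft.AspNetCore.Components.WebAssembly.Http;

// Equivalent to setting the "WebAssemblyEnableStreamingResponse" request option by hand.
requestMessage.SetBrowserResponseStreamingEnabled(true);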