How to stream responses when using LangChain in the new route handlers
I'm trying to build a ChatGPT-style app using LangChain, but I can't figure out how to stream the responses to the client.
This is how I have initialized OpenAI with LangChain.
I can get the stream from the `handleLLMNewToken` callback, which is invoked with a string for every token (word) generated.
Can someone please help me with how to stream this to the client using the new route handlers?
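One common pattern (a sketch, not an official LangChain answer) is to bridge the token callback into a web `ReadableStream` and return that stream as the `Response` body from the route handler. Below, `fakeModelCall` is a hypothetical stand-in for the actual LangChain streaming call; in a real handler you would pass the `handleLLMNewToken` callback to the model's callback options instead. Assumes Node 18+, where `ReadableStream`, `Response`, and `TextEncoder` are globals.

```typescript
// Hypothetical stand-in for a streaming LLM call: invokes the
// callback once per generated token, then resolves.
async function fakeModelCall(
  prompt: string,
  handleLLMNewToken: (token: string) => void,
): Promise<void> {
  for (const token of ["Hello", ", ", "world", "!"]) {
    handleLLMNewToken(token);
  }
}

// Route-handler pattern: enqueue each token into a ReadableStream as it
// arrives, close the stream when the model call finishes, and return the
// stream as the response body so the client can read it incrementally.
function streamingResponse(prompt: string): Response {
  const encoder = new TextEncoder();
  const stream = new ReadableStream<Uint8Array>({
    start(controller) {
      // Note: the model call is intentionally NOT awaited here, so the
      // Response is returned immediately while tokens keep flowing in.
      fakeModelCall(prompt, (token) => {
        controller.enqueue(encoder.encode(token));
      })
        .then(() => controller.close())
        .catch((err) => controller.error(err));
    },
  });
  return new Response(stream, {
    headers: { "Content-Type": "text/plain; charset=utf-8" },
  });
}
```

In an actual route handler (e.g. `export async function POST(req: Request)` in Next.js 13+), you would replace `fakeModelCall` with your model call, passing `handleLLMNewToken` in its callbacks, and read the stream on the client with `response.body.getReader()` or `response.text()`.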