vllm worker OpenAI stream
Hi everyone,
I followed the Runpod documentation and wrote a simple OpenAI client that streams from a serverless vLLM endpoint serving the LLaVA model (llava-hf/llava-1.5-7b-hf). However, I encountered the following error:
Has anyone experienced this issue? Any suggestions for resolving it?
Code:
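Roughly, the client looks like the sketch below (a streaming chat completion against the worker's OpenAI-compatible route). The API key, endpoint ID, and prompt are placeholders, and the base URL format is the OpenAI-compatible route described in the Runpod vLLM worker docs:

```python
from openai import OpenAI

# Placeholders: substitute your own Runpod API key and serverless endpoint ID.
client = OpenAI(
    api_key="RUNPOD_API_KEY",
    base_url="https://api.runpod.ai/v2/ENDPOINT_ID/openai/v1",
)

# Request a streamed chat completion from the vLLM worker running LLaVA.
stream = client.chat.completions.create(
    model="llava-hf/llava-1.5-7b-hf",
    messages=[{"role": "user", "content": "Describe a cat in one sentence."}],
    stream=True,
)

# Print tokens as they arrive; delta.content can be None on some chunks.
for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)
print()
```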
1 Reply
Hugging Face Forums: "Chat_template is not set & throwing error"