RunPod · 3mo ago
aurelium

Is the vLLM worker updated for LLaMA3.1 yet?

If not, is anyone aware of a good serverless container that does support it?
2 Replies
nerdylive · 3mo ago
Not yet. Let's wait for the vLLM worker from RunPod to be updated; I'd guess it'll be done in a few more days or less.
NERDDISCO · 2mo ago
It is updated, you can use it!
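For anyone landing here later: a minimal sketch of how a request to a RunPod serverless vLLM endpoint might be constructed. The endpoint ID, API key, and model name below are placeholders, and the exact input schema accepted by the worker is an assumption (the vLLM worker accepts OpenAI-style chat messages); check the worker's README for the current format.

```python
import json
from urllib import request


def build_runsync_request(endpoint_id: str, api_key: str, prompt: str):
    """Build (but do not send) a request to a RunPod serverless endpoint.

    Assumes the standard RunPod serverless URL scheme and an
    OpenAI-style "messages" input for the vLLM worker -- verify
    against the worker's docs before relying on this.
    """
    url = f"https://api.runpod.ai/v2/{endpoint_id}/runsync"
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = {
        "input": {
            "messages": [{"role": "user", "content": prompt}],
            # Hypothetical model name -- replace with your deployed model.
            "model": "meta-llama/Meta-Llama-3.1-8B-Instruct",
        }
    }
    return request.Request(url, data=json.dumps(body).encode(),
                           headers=headers, method="POST")


# Example (no network call is made here):
req = build_runsync_request("YOUR_ENDPOINT_ID", "YOUR_API_KEY", "Hello!")
print(req.full_url)
```

Sending the request (e.g. with `request.urlopen`) requires a valid endpoint ID and API key from the RunPod console.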