RunPod · 7mo ago
aurelium

Is the vLLM worker updated for LLaMA3.1 yet?

If not, is anyone aware of a good serverless container that does support it?
2 Replies
nerdylive · 7mo ago
Not yet. Let's wait for RunPod's vllm-worker to be updated; I'd guess it'll be a few more days or less.
NERDDISCO · 6mo ago
It is updated, you can use it!
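For anyone picking this up: once the worker is deployed, requests are plain JSON. A minimal sketch of building a payload for a serverless vLLM endpoint is below; the exact field names (`input`, `prompt`, `sampling_params`) are assumptions based on the worker-vllm README, so check the repo for your version.

```python
import json

def build_vllm_request(prompt: str, max_tokens: int = 256, temperature: float = 0.7) -> dict:
    """Build a JSON payload for a RunPod serverless vLLM endpoint.

    Field names are assumptions from the worker-vllm docs; verify
    against the version of the worker you deploy.
    """
    return {
        "input": {
            "prompt": prompt,
            "sampling_params": {
                "max_tokens": max_tokens,
                "temperature": temperature,
            },
        }
    }

# Example: serialize a request for a Llama 3.1 endpoint.
payload = build_vllm_request("Why is the sky blue?")
print(json.dumps(payload))
```

You would POST this body to `https://api.runpod.ai/v2/<endpoint_id>/run` with your API key in the `Authorization` header (endpoint ID and key are placeholders here).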