RunPod · 8mo ago

vllm

Any plans to update the vLLM worker image? I would like to test Phi 3 and Llama 3.1, but both are unsupported by the current image (serverless).
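For reference, a minimal local check of whether a newer vLLM release can load one of these models, before the worker image catches up, might look like the sketch below. The model ID and parameters are just examples, and this assumes you've upgraded to a vLLM version that has added support for the architecture.

```python
# Hedged sketch: verify a model loads under a newer vLLM release locally
# before relying on the serverless worker image.
# Assumes `pip install --upgrade vllm` pulled a version supporting the model.
from vllm import LLM, SamplingParams

# Example model ID; any Hugging Face model ID works the same way.
llm = LLM(model="microsoft/Phi-3-mini-4k-instruct", trust_remote_code=True)

params = SamplingParams(max_tokens=64, temperature=0.7)
outputs = llm.generate(["Say hello in one sentence."], params)
print(outputs[0].outputs[0].text)
```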
2 Replies
clearlylacking · 8mo ago
Gemma2 also doesn't seem to be supported
nerdylive · 8mo ago
Target is within the next few weeks, a RunPod staff member said, but he can't guarantee exactly when it will be worked on.