vLLM
Any plans to update the vLLM image worker? I'd like to test Phi 3 and Llama 3.1, but both are unsupported by the current image. (serverless)
Gemma 2 also doesn't seem to be supported.
Target: within the next few weeks, according to RunPod staff, but they couldn't guarantee exactly when. It's being worked on.