Created by c on 7/24/2024 in #⚡|serverless
vLLM
Any plans to update the vLLM worker image? I'd like to test Phi-3 and Llama 3.1, but neither is supported by the current image. (serverless)
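Once an image with support for those models is deployed as a serverless endpoint, invoking it follows the usual RunPod pattern. Below is a minimal sketch (untested; `ENDPOINT_ID` and the API key are placeholders, and the exact `input` schema — `prompt` plus `sampling_params` — is assumed to match the vLLM worker's handler) of a synchronous request to the `/runsync` route:

```python
import json
import os
import urllib.request

# Placeholders -- substitute your own serverless endpoint ID and API key.
ENDPOINT_ID = "your-endpoint-id"
API_KEY = os.environ.get("RUNPOD_API_KEY", "your-api-key")

# RunPod serverless wraps the worker's arguments in an "input" object.
# The "prompt"/"sampling_params" fields here are an assumption about the
# vLLM worker's expected schema -- check the worker's README for yours.
payload = {
    "input": {
        "prompt": "Summarize the plot of Hamlet in two sentences.",
        "sampling_params": {"max_tokens": 128, "temperature": 0.7},
    }
}

req = urllib.request.Request(
    f"https://api.runpod.ai/v2/{ENDPOINT_ID}/runsync",
    data=json.dumps(payload).encode(),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
)

# Uncomment to actually send the request once the endpoint exists:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp))
print(json.dumps(payload, indent=2))
```

The request is only constructed here, not sent, so the sketch can be run safely before the endpoint is live.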
4 replies