aurelium
RunPod
Created by aurelium on 7/29/2024 in #⚡|serverless
Has the vLLM worker been updated for Llama 3.1 yet?
If not, does anyone know of a good serverless container that supports it?
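One way to answer this locally before deploying is to check which vLLM version is installed inside the worker image. The sketch below assumes a hypothetical minimum version (0.5.3) for Llama 3.1 support; that cutoff is an assumption, not something confirmed by RunPod's worker docs, so verify it against the vLLM release notes.

```python
# Quick local check: does the installed vLLM version meet an assumed
# minimum for Llama 3.1 support? The 0.5.3 cutoff is a hypothetical
# placeholder, not confirmed by RunPod or vLLM documentation.
from importlib.metadata import version, PackageNotFoundError

ASSUMED_MIN = (0, 5, 3)  # assumption; check vLLM release notes

def parse(v: str) -> tuple:
    # Keep only leading numeric components: "0.5.3.post1" -> (0, 5, 3)
    parts = []
    for p in v.split("."):
        if p.isdigit():
            parts.append(int(p))
        else:
            break
    return tuple(parts)

def supports_llama31(installed: str) -> bool:
    # Tuple comparison handles differing lengths: (0, 5) < (0, 5, 3)
    return parse(installed) >= ASSUMED_MIN

if __name__ == "__main__":
    try:
        print(supports_llama31(version("vllm")))
    except PackageNotFoundError:
        print("vllm not installed")
```

Running this inside the container (e.g. via `docker run ... python check_vllm.py`, where `check_vllm.py` is a name chosen here) tells you whether the image's vLLM is new enough, under the stated version assumption.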
5 replies