Created by Nickbkl on 12/11/2024 in #⚡|serverless
Running llama 3.3 70b using vLLM and 160gb network volume
Hi, I want to check whether 160 GB is enough for Llama 3.3 70B, and whether I could use a smaller network volume.
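A rough sizing check can answer this before provisioning anything. The sketch below assumes the standard bf16/fp16 safetensors checkpoint (2 bytes per parameter); the function name and the 2-byte assumption are mine, not from the thread:

```python
# Back-of-envelope estimate of the disk space Llama 3.3 70B's weights need.
# Assumption: weights stored in bf16/fp16 (2 bytes per parameter),
# as in the usual safetensors release.
def model_size_gb(n_params: float, bytes_per_param: int = 2) -> float:
    """Approximate weight size in decimal gigabytes."""
    return n_params * bytes_per_param / 1e9

weights_gb = model_size_gb(70e9)  # ~140 GB for the bf16 checkpoint
print(f"~{weights_gb:.0f} GB of weights")
```

By this estimate a 160 GB volume leaves only ~20 GB of headroom for tokenizer files, configs, and any download staging, so it should fit, but a smaller volume would not, unless a quantized (e.g. 4-bit or 8-bit) checkpoint is used instead.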