RunPod • Created by Nickbkl on 12/11/2024 in #⚡|serverless
Running llama 3.3 70b using vLLM and 160gb network volume
Hi, I want to check if 160 GB is enough for Llama 70B, and whether I could get away with a smaller network volume.
49 replies
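A rough sizing sketch (not taken from the thread) for sanity-checking the question: the disk footprint of the checkpoint is roughly the parameter count times bytes per parameter, so 16-bit weights for a 70B model come to about 140 GB, which just fits in 160 GB but leaves little headroom for the download cache, while a 4-bit quantized checkpoint would fit comfortably in a much smaller volume. The dtype sizes below are general assumptions, not figures quoted in the replies.

```python
# Back-of-envelope estimate of the disk space a Llama 3.3 70B checkpoint needs
# on a network volume, compared against the 160 GB size from the question.
# Assumption: ~70e9 parameters; actual shard sizes on the Hub may differ slightly.

PARAMS = 70e9              # approximate parameter count of Llama 3.3 70B
BYTES_PER_PARAM = {
    "bf16/fp16": 2.0,      # full-precision weights
    "int8": 1.0,           # 8-bit quantized weights
    "int4 (AWQ/GPTQ)": 0.5 # 4-bit quantized weights
}
VOLUME_GB = 160            # network volume size from the question

for dtype, nbytes in BYTES_PER_PARAM.items():
    weights_gb = PARAMS * nbytes / 1e9
    verdict = "fits" if weights_gb < VOLUME_GB else "does NOT fit"
    print(f"{dtype:>16}: ~{weights_gb:.0f} GB of weights -> {verdict} in {VOLUME_GB} GB volume")
```

Note that vLLM loads the weights into GPU memory at startup, so the network volume only needs to hold the downloaded checkpoint (plus any Hugging Face cache files); by this estimate 160 GB is workable but tight for the unquantized model, and a quantized checkpoint would allow a smaller volume.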