OutOfMemory
Created by SATAN on 5/22/2024 in #⚡|serverless

Thorsten
Same issue here: trying to deploy llama-3-70B and other LLMs, and all of them error out with an OutOfMemory error, even when using the highest GPU tier.
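(For context on why the highest tier can still OOM: a 70B model in fp16 needs roughly 140 GB for the weights alone, more than any single GPU offers, so it has to be quantized or sharded across GPUs. Below is a minimal sketch of one common workaround, 4-bit quantization via transformers/bitsandbytes; the model ID and memory figures are assumptions for illustration, not confirmed from this thread.)

```python
# Minimal sketch: load a 70B model with 4-bit quantization so the weights
# fit on a single high-memory GPU. Assumes transformers, accelerate, and
# bitsandbytes are installed; the model ID below is illustrative.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "meta-llama/Meta-Llama-3-70B-Instruct"  # assumed model ID

# 4-bit NF4 quantization cuts weight memory from ~140 GB (fp16)
# to roughly 35-40 GB, which can fit on one 48-80 GB GPU.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",  # shard layers across available GPUs if more than one
)
```

If you need full-precision weights instead, the usual route is a multi-GPU worker with tensor parallelism rather than a single card.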
34 replies