RunPod
• Created by Emad on 8/19/2024 in #⚡|serverless
Llama 3.1 8B model cold start and delay times are very long
30 replies
RunPod
• Created by Emad on 8/8/2024 in #⚡|serverless
Can't run a 70B Llama 3.1 model on 2 A100 80 GB GPUs.
Hey, so I tried running the 70B Llama model on 2 GPUs/worker, but it keeps getting stuck at the same place every time. If I switch instead to the 8B model on 1 GPU/worker with a 48 GB GPU, it works easily. The issue only occurs with the 70B-parameter model on 2 GPUs/worker. (A minimal multi-GPU config sketch is included after this thread.)
67 replies
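For context on what a 2-GPU serverless setup for a 70B model involves, here is a minimal sketch, assuming a RunPod Python serverless handler wrapping vLLM. The model id, memory settings, and handler shape are illustrative assumptions, not the exact worker discussed in the thread; the key point is that 70B weights in bf16 are roughly 140 GB and must be sharded across both 80 GB A100s via tensor parallelism.

```python
# Sketch only: a RunPod serverless handler that shards a 70B Llama checkpoint
# across 2 GPUs with vLLM tensor parallelism. Model id and tuning values are
# assumptions for illustration.
import runpod
from vllm import LLM, SamplingParams

llm = LLM(
    model="meta-llama/Meta-Llama-3.1-70B-Instruct",  # assumed model id
    tensor_parallel_size=2,       # one weight shard per GPU on the 2-GPU worker
    gpu_memory_utilization=0.90,  # leave headroom for the KV cache
    max_model_len=8192,           # cap context length to bound KV cache size
)

def handler(job):
    """Generate a completion for the prompt in the job input."""
    prompt = job["input"]["prompt"]
    params = SamplingParams(max_tokens=256, temperature=0.7)
    outputs = llm.generate([prompt], params)
    return {"output": outputs[0].outputs[0].text}

runpod.serverless.start({"handler": handler})
```

Note that `tensor_parallel_size` defaults to 1, so it has to be set to match the GPU count per worker; otherwise the engine tries to place all 70B weights on a single GPU, which does not fit on one 80 GB card.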
RunPod
• Created by Emad on 4/5/2024 in #⚡|serverless
Balance Disappeared
Hi, I had an account on RunPod a long while ago, before the UI changed, and I had a Pod as well. I recently logged back in and found the UI completely different. More importantly, my serverless Pod was gone, as was the $66 I had in my balance.
18 replies