RunPod
Created by ErezL on 3/30/2025 in #⚡|serverless
I am trying to deploy a "meta-llama/Llama-3.1-8B-Instruct" model on Serverless vLLM
It works now. Thanks.
40 replies
Thanks for your time.
I applied and am waiting for approval...
I get an error saying I need to request access to the model on Hugging Face.
Is a read-only token OK?
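For reference, a read-only Hugging Face token is generally sufficient for downloading gated models such as Llama 3.1. One common way to supply it to a vLLM worker (a sketch; the exact variable name your template reads may differ, and the token value below is a placeholder) is as an environment variable in the endpoint configuration:

```shell
# Placeholder token value — substitute your own read-only token
# from the Hugging Face settings page (Access Tokens).
export HF_TOKEN="hf_xxxxxxxxxxxxxxxxxxxx"
```

The token only needs read scope, since the worker merely downloads model weights at startup.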
How do I choose a GPU? Where do I even see which GPU I got?
no
should I be using a different template?
How can I choose a GPU? (there is no choice available in the setup process)
yes
It's the default (I deleted the instance by now).
I did
I tried all of them, I think. The strongest possible, for sure.
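Once the endpoint is working, it can be exercised through the OpenAI-compatible route that RunPod's serverless vLLM workers expose. A minimal sketch (the endpoint ID below is a placeholder, and a RunPod API key is needed to actually send the request):

```python
# Placeholder endpoint ID — replace with your own RunPod serverless endpoint ID.
ENDPOINT_ID = "abc123"
BASE_URL = f"https://api.runpod.ai/v2/{ENDPOINT_ID}/openai/v1"


def build_chat_request(prompt: str) -> dict:
    """Build an OpenAI-compatible chat completion payload for the vLLM worker."""
    return {
        "model": "meta-llama/Llama-3.1-8B-Instruct",
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 128,
    }


# Sending the request could then look like this, e.g. with the `openai` client:
# from openai import OpenAI
# client = OpenAI(base_url=BASE_URL, api_key="<RUNPOD_API_KEY>")
# resp = client.chat.completions.create(**build_chat_request("Hello!"))
# print(resp.choices[0].message.content)
```

The model name passed in the payload should match the model the endpoint was deployed with.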