RunPod•7mo ago
MattArgentina

vLLM Serverless error

When using the vLLM Serverless template I get the following error when trying to use the model cognitivecomputations/dolphin-2.9-llama3-8b: HFValidationError: Repo id must use alphanumeric chars or '-', '_', '.', '--' and '..' are forbidden, '-' and '.' cannot start or end the name
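For context, the repo-id rule that HFValidationError enforces can be sketched as a stand-alone check. The regex below is an approximation of Hugging Face's validator for illustration, not its actual implementation:

```python
import re

# Approximates the rule quoted in the error: alphanumerics plus
# '-', '_', '.', with an optional 'namespace/' prefix; '-' and '.'
# may not start or end a name segment.
NAME = r"[A-Za-z0-9_](?:[A-Za-z0-9._-]*[A-Za-z0-9_])?"
REPO_ID_RE = re.compile(rf"^(?:{NAME}/)?{NAME}$")

def looks_like_valid_repo_id(repo_id: str) -> bool:
    if "--" in repo_id or ".." in repo_id:  # explicitly forbidden pairs
        return False
    return REPO_ID_RE.fullmatch(repo_id) is not None

# The model id from the question passes, so the id itself is not the problem:
assert looks_like_valid_repo_id("cognitivecomputations/dolphin-2.9-llama3-8b")
assert not looks_like_valid_repo_id("")  # an empty value fails validation
```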
7 Replies
digigoblin•7mo ago
Not sure whether your issue is related to this: https://github.com/runpod-workers/worker-vllm/issues/75
GitHub
Incorrect path_or_model_id · Issue #75 · runpod-workers/worker-vllm
Hi! In the last few hours I'm getting this error while pulling any image from hugging face: OSError: Incorrect path_or_model_id: ''. Please provide either the path to a local folder or ...
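Relatedly, the OSError in that issue is what you would see when the worker's MODEL_NAME comes through empty. A minimal illustration, assuming the worker resolves the model via transformers (this is not the worker's actual code):

```python
import os
from transformers import AutoConfig

# If MODEL_NAME is unset or empty, this raises:
#   OSError: Incorrect path_or_model_id: ''
model_name = os.environ.get("MODEL_NAME", "")
config = AutoConfig.from_pretrained(model_name)
```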
digigoblin•7mo ago
@MattArgentina what did you set your MODEL_NAME environment variable to?
MattArgentina (OP)•7mo ago
Yeah, it's the same error. MODEL_NAME is set to cognitivecomputations/dolphin-2.9-llama3-8b. It was working yesterday.
haris•7mo ago
cc: @Alpay Ariyak
Solution
digigoblin•7mo ago
It's fixed; it was due to the thing you just posted in #🚨|incidents @haris
digigoblin•7mo ago
You have to scale workers down to zero and back up again, though, so that the environment variables are applied correctly.
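Why the scale-down is needed: a worker process snapshots its environment at startup, so changes to the endpoint's env vars never reach workers that are already running. A minimal illustration, assuming the worker reads MODEL_NAME the usual way:

```python
import os

# os.environ reflects the environment the process was started with;
# a config change on the endpoint does not propagate into running
# workers. Scaling to zero kills the old processes, and scaling back
# up starts fresh ones that pick up the corrected MODEL_NAME.
MODEL_NAME = os.environ.get("MODEL_NAME", "")
print(f"worker starting with MODEL_NAME={MODEL_NAME!r}")
```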
haris•7mo ago
Got it, I'll note that in the incident announcement as well once I'm back from lunch.