vLLM Serverless error
When using the vLLM Serverless template, I get the following error when trying to use the model cognitivecomputations/dolphin-2.9-llama3-8b:
HFValidationError: Repo id must use alphanumeric chars or '-', '_', '.', '--' and '..' are forbidden, '-' and '.' cannot start or end the name
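For context, this error comes from Hugging Face's repo-id validation, which runs before any download is attempted, so an empty or malformed model id (for example, a MODEL_NAME environment variable that never reached the container) fails this same check. A minimal sketch of the rule quoted in the error, not the library's exact implementation:

```python
import re

# Simplified approximation of the repo-id rule in the error message:
# alphanumerics plus '-', '_', '.'; '--' and '..' are forbidden; a name
# cannot start or end with '-' or '.'. (Sketch only, not the exact regex
# huggingface_hub uses.)
_SEGMENT = r"\w+(?:[-._]\w+)*"
REPO_ID_RE = re.compile(rf"^(?:{_SEGMENT}/)?{_SEGMENT}$")

def is_valid_repo_id(repo_id: str) -> bool:
    return bool(REPO_ID_RE.match(repo_id))

print(is_valid_repo_id("cognitivecomputations/dolphin-2.9-llama3-8b"))  # the id from this thread: valid
print(is_valid_repo_id(""))  # an unset MODEL_NAME collapses to this case: invalid
```

Note that an empty string fails the check, which is why a missing env var surfaces as this validation error rather than a clearer "variable not set" message.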
Not sure whether your issue is related to this:
https://github.com/runpod-workers/worker-vllm/issues/75
GitHub: Incorrect path_or_model_id · Issue #75 · runpod-workers/worker-vllm
"Hi! In the last few hours I'm getting this error while pulling any image from hugging face: OSError: Incorrect path_or_model_id: ''. Please provide either the path to a local folder or ..."
@MattArgentina what did you set your MODEL_NAME environment variable to?

Yeah, it's the same error. MODEL_NAME is set to cognitivecomputations/dolphin-2.9-llama3-8b. It was working yesterday.
cc: @Alpay Ariyak
Solution
It's fixed, it was due to the thing you just posted in #🚨|incidents @haris
You have to scale workers down to zero and back up again, though, so that the environment variables are correctly applied.
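For anyone hitting this later: the scale-to-zero step matters because workers read environment variables like MODEL_NAME once at container start, so a value changed in the endpoint settings is invisible to containers that are already running. A small illustrative sketch (the function name is hypothetical, not part of worker-vllm):

```python
# Hypothetical helper illustrating why stale containers fail: the env
# snapshot is taken when a container starts, so only freshly launched
# workers see an updated MODEL_NAME.
def resolve_model_name(env_at_container_start: dict) -> str:
    model = env_at_container_start.get("MODEL_NAME", "").strip()
    if not model:
        # An empty id is what triggers the HFValidationError in this thread.
        raise ValueError("MODEL_NAME is empty; scale workers to zero and back up")
    return model

print(resolve_model_name({"MODEL_NAME": "cognitivecomputations/dolphin-2.9-llama3-8b"}))
```

Scaling to zero kills the old containers, and the replacements launched on scale-up pick up the fixed environment.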
Got it, I'll note that in the incident announcement as well once I'm back from lunch.