Error downloading model: TheBloke/Mixtral-8x7B-MoE-RP-Story-AWQ
2335.9 Traceback (most recent call last):
2335.9 File "/download_model.py", line 48, in <module>
2335.9 tokenizer_folder = download_extras_or_tokenizer(tokenizer, download_dir, revisions["tokenizer"])
2335.9 File "/download_model.py", line 10, in download_extras_or_tokenizer
2335.9 folder = snapshot_download(
2335.9 File "/usr/local/lib/python3.10/dist-packages/huggingface_hub/utils/_validators.py", line 110, in _inner_fn
2335.9 validate_repo_id(arg_value)
2335.9 File "/usr/local/lib/python3.10/dist-packages/huggingface_hub/utils/_validators.py", line 164, in validate_repo_id
2335.9 raise HFValidationError(
2335.9 huggingface_hub.utils._validators.HFValidationError: Repo id must use alphanumeric chars or '-', '_', '.',
'--' and '..' are forbidden, '-' and '.' cannot start or end the name, max length is 96: ''.
------
Dockerfile:35
--------------------
34 | COPY builder/download_model.py /download_model.py
35 | >>> RUN --mount=type=secret,id=HF_TOKEN,required=false \
36 | >>> if [ -f /run/secrets/HF_TOKEN ]; then \
37 | >>> export HF_TOKEN=$(cat /run/secrets/HF_TOKEN); \
38 | >>> fi && \
39 | >>> if [ -n "$MODEL_NAME" ]; then \
40 | >>> python3 /download_model.py; \
41 | >>> fi
42 |
--------------------
ERROR: failed to solve: process "/bin/sh -c if [ -f /run/secrets/HF_TOKEN ]; then export HF_TOKEN=$(cat /run/secrets/HF_TOKEN); fi && if [ -n "$MODEL_NAME" ]; then python3 /download_model.py; fi" did not complete successfully: exit code: 1
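The validator message ends in `: ''`, meaning `download_extras_or_tokenizer` handed `snapshot_download` an empty repo id (most likely because the tokenizer name or revision build arg resolved to an empty string). A minimal guard sketch that fails with a readable message before reaching `snapshot_download`; the function name and the regex are illustrative (the regex mirrors the rule from the error message, it is not the library's own validator):

```python
import re
import sys

# Approximation of the Hugging Face repo-id rule quoted in the traceback:
# alphanumerics plus '-', '_', '.', no leading/trailing '-' or '.',
# no '--' or '..', max length 96 ('/' separates namespace and name).
REPO_ID_RE = re.compile(r"^(?!-|\.)(?!.*(--|\.\.))[A-Za-z0-9_.\-/]{1,96}(?<!-)(?<!\.)$")

def resolve_tokenizer_repo(model_name, tokenizer_name=None):
    """Fall back to the model repo when no tokenizer repo is given, and
    exit early with a clear message instead of a bare HFValidationError
    deep inside snapshot_download()."""
    repo_id = tokenizer_name or model_name
    if not repo_id or not REPO_ID_RE.match(repo_id):
        sys.exit(f"Invalid or empty repo id {repo_id!r}; "
                 "check the MODEL_NAME / TOKENIZER_NAME build args.")
    return repo_id

# e.g. fall back to the model repo when TOKENIZER_NAME is unset:
# resolve_tokenizer_repo(os.environ["MODEL_NAME"], os.getenv("TOKENIZER_NAME"))
```

With a check like this, the build fails at the top of the script with the offending build arg named, rather than with the generic validator traceback above.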
aha I see it's already mentioned on github. https://github.com/runpod-workers/worker-vllm/issues/42
GitHub
Error after tokenizer commit · Issue #42 · runpod-workers/worker-vllm
2b5b8df: after this commit I can't build my image: docker build -t instructkr/qwen:1.5_72b_chat --build-arg MODEL_NAME="Qwen/Qwen1.5-72B-Chat-AWQ" --build-arg QUANTIZATION="awq"...
Are you having this issue with vLLM? Your original message didn't specify what you were using; you just said you weren't able to download the model. You should specify if you're using vLLM, etc.
It’s in my second post 👀
You just said it's mentioned in the vLLM GitHub repo; you didn't mention anywhere that you had an issue with vLLM.
Please stop typing!
People can't read minds unfortunately, so you need to be specific.
It’s the exact same issue please stop! 😭
I will type as much as I want, so shut up
It pisses me off when people like you screw up, don't provide details, and then get aggressive towards others for your own fault
Leave me alone!
Gladly 🖕
I’ll take care of it today
Forgot to send an update, but it was fixed on Friday
Thanks, I noticed you did something, but huggingface was down when I went to try it 😅