RunPod · 10mo ago
JJonahJ

error downloading model? TheBloke/Mixtral-8x7B-MoE-RP-Story-AWQ

2335.9 Traceback (most recent call last):
2335.9   File "/download_model.py", line 48, in <module>
2335.9     tokenizer_folder = download_extras_or_tokenizer(tokenizer, download_dir, revisions["tokenizer"])
2335.9   File "/download_model.py", line 10, in download_extras_or_tokenizer
2335.9     folder = snapshot_download(
2335.9   File "/usr/local/lib/python3.10/dist-packages/huggingface_hub/utils/_validators.py", line 110, in _inner_fn
2335.9     validate_repo_id(arg_value)
2335.9   File "/usr/local/lib/python3.10/dist-packages/huggingface_hub/utils/_validators.py", line 164, in validate_repo_id
2335.9     raise HFValidationError(
2335.9 huggingface_hub.utils._validators.HFValidationError: Repo id must use alphanumeric chars or '-', '_', '.', '--' and '..' are forbidden, '-' and '.' cannot start or end the name, max length is 96: ''.
------
Dockerfile:35
--------------------
  34 |     COPY builder/download_model.py /download_model.py
  35 | >>> RUN --mount=type=secret,id=HF_TOKEN,required=false \
  36 | >>>     if [ -f /run/secrets/HF_TOKEN ]; then \
  37 | >>>         export HF_TOKEN=$(cat /run/secrets/HF_TOKEN); \
  38 | >>>     fi && \
  39 | >>>     if [ -n "$MODEL_NAME" ]; then \
  40 | >>>         python3 /download_model.py; \
  41 | >>>     fi
  42 |
--------------------
ERROR: failed to solve: process "/bin/sh -c if [ -f /run/secrets/HF_TOKEN ]; then export HF_TOKEN=$(cat /run/secrets/HF_TOKEN); fi && if [ -n "$MODEL_NAME" ]; then python3 /download_model.py; fi" did not complete successfully: exit code: 1
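The traceback is easiest to read from the bottom up: validate_repo_id rejects the repo id because it is an empty string (note the ": ''." at the end of the HFValidationError), i.e. the tokenizer download is being attempted with no tokenizer name. Below is a minimal sketch of that failure, assuming the tokenizer name resolves to "" when no separate tokenizer is configured; the variable names are illustrative, not the actual download_model.py code.

# Minimal reproduction sketch: snapshot_download() validates repo_id before
# doing any network work, so an empty tokenizer name fails immediately with
# the same HFValidationError seen in the build log above.
from huggingface_hub import snapshot_download
from huggingface_hub.utils import HFValidationError

tokenizer_name = ""  # illustrative: what an unset tokenizer setting collapses to

try:
    snapshot_download(repo_id=tokenizer_name, revision="main")
except HFValidationError as err:
    print(f"repo id rejected: {err}")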
13 Replies
JJonahJ (OP) · 10mo ago
Aha, I see it's already mentioned on GitHub: https://github.com/runpod-workers/worker-vllm/issues/42
GitHub
Error after tokenizer commit · Issue #42 · runpod-workers/worker-vllm
2b5b8df after this commit i can't build my image: docker build -t instructkr/qwen:1.5_72b_chat --build-arg MODEL_NAME="Qwen/Qwen1.5-72B-Chat-AWQ" --build-arg QUANTIZATION="awq"…
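The linked issue appears to describe the same failure mode: after the tokenizer-related commit, the tokenizer download runs even when no separate tokenizer is configured, so snapshot_download receives an empty repo id. The sketch below shows the kind of guard that avoids it by falling back to the model repo; it is illustrative only, not the actual worker-vllm patch, and the TOKENIZER_NAME variable name is an assumption.

# Hypothetical guard (not the actual worker-vllm fix): only download a separate
# tokenizer repo when one is explicitly configured, otherwise reuse the model
# repo so snapshot_download never receives an empty repo_id.
import os
from huggingface_hub import snapshot_download

def download_tokenizer(model_name: str, download_dir: str, revision: str = "main") -> str:
    # TOKENIZER_NAME is an assumed env var name; fall back to the model repo if unset.
    tokenizer_name = os.getenv("TOKENIZER_NAME", "").strip() or model_name
    return snapshot_download(
        repo_id=tokenizer_name,
        revision=revision,
        cache_dir=download_dir,
        allow_patterns=["*.json", "*.model", "*.txt"],  # tokenizer/config files only
    )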
ashleyk · 10mo ago
Are you having this issue with vllm? Your original message didn't specify what you were using; you just said you weren't able to download the model. You should specify if you're using vllm, etc.
JJonahJ (OP) · 10mo ago
It’s in my second post 👀
ashleyk · 10mo ago
You just say it's mentioned in the vllm GitHub repo; you didn't mention anywhere that you had an issue with vllm.
JJonahJ (OP) · 10mo ago
Please stop typing!
ashleyk · 10mo ago
People can't read minds unfortunately, so you need to be specific.
JJonahJ (OP) · 10mo ago
It's the exact same issue, please stop! 😭
ashleyk · 10mo ago
I will type as much as I want, so shut up. It pisses me off when people like you screw up, don't provide details, and then get aggressive towards others for your own fault.
JJonahJ (OP) · 10mo ago
Leave me alone!
ashleyk · 10mo ago
Gladly 🖕
Alpay Ariyak · 10mo ago
I'll take care of it today.
Forgot to send an update, but it was fixed on Friday.
JJonahJ (OP) · 10mo ago
Thanks. I noticed you did something, but Hugging Face was down when I went to try it 😅