Casper.
RunPod
Created by Casper. on 7/23/2024 in #⛅|pods
Updated Torch templates
Torch 2.4 is out now. It would be nice to have version-specific templates soonish, if you want to do this 🙂
10 replies
RunPod
Created by Casper. on 7/23/2024 in #⛅|pods
Updated Torch templates
I used it today and it works well. What do you think about making version-specific variants of this? For example, we are getting Torch 2.4 soon, but moving to the new version immediately is not desirable
10 replies
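For context on why version-specific templates matter here: code validated against one Torch release can guard itself against an image silently moving to a newer one. A minimal sketch, assuming a deployment validated against Torch 2.3.x; the version string is illustrative, not an actual RunPod template tag:

```python
import torch

# Fail fast if the pod image drifted to a newer Torch than we validated against.
EXPECTED_PREFIX = "2.3"  # illustrative pin, not an actual RunPod template tag
if not torch.__version__.startswith(EXPECTED_PREFIX):
    raise RuntimeError(
        f"Code was validated against torch {EXPECTED_PREFIX}.x, "
        f"but this image ships torch {torch.__version__}; "
        "use a version-specific template instead of the rolling one."
    )
```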
RunPod
Created by Casper. on 7/23/2024 in #⛅|pods
Updated Torch templates
This template is not cached on the machine, unfortunately. That's one of the reasons I like the "official" ones. The official ones are also version-specific, which is handy during development
10 replies
RunPod
Created by Casper. on 7/23/2024 in #⛅|pods
Updated Torch templates
@Justin Merrell would you happen to know who is updating templates?
10 replies
RunPod
Created by Casper. on 6/21/2024 in #⛅|pods
PyTorch 2.3: Lacking image on RunPod
Then a 2.3.0 release would be nice
9 replies
RunPod
Created by Casper. on 6/12/2024 in #⚡|serverless
update worker-vllm to vllm 0.5.0
Nice to hear it's already in progress! Let me know when it's ready; I would love to test it out
4 replies
RunPod
Created by Casper. on 2/28/2024 in #⚡|serverless
worker-vllm build fails
The build worked, @Alpay Ariyak, thanks for fixing it
30 replies
RunPod
Created by Casper. on 2/28/2024 in #⚡|serverless
worker-vllm build fails
Rebuilding now, let's see
30 replies
RunPod
Created by Casper. on 2/28/2024 in #⚡|serverless
worker-vllm build fails
Just for reference
30 replies
RunPod
Created by Casper. on 2/28/2024 in #⚡|serverless
worker-vllm build fails
I am building on a MacBook M2, btw
30 replies
RunPod
Created by Casper. on 2/28/2024 in #⚡|serverless
worker-vllm build fails
I'm on the latest commit d91ccb866fc784b81a558f0da44041a020ba54e0
30 replies
RunPod
Created by Casper. on 2/28/2024 in #⚡|serverless
worker-vllm build fails
Yeah, it loads the new 0.3.1:
[vllm-base 1/7] FROM docker.io/runpod/worker-vllm:base-0.3.1-cuda11.8.0
30 replies
RunPod
Created by Casper. on 2/28/2024 in #⚡|serverless
worker-vllm build fails
Same as I pasted above
> [vllm-base 6/7] RUN --mount=type=secret,id=HF_TOKEN,required=false if [ -f /run/secrets/HF_TOKEN ]; then export HF_TOKEN=$(cat /run/secrets/HF_TOKEN); fi && if [ -n "PatentPilotAI/mistral-7b-patent-instruct-v2" ]; then python3 /download_model.py; fi:
#9 10.13 Traceback (most recent call last):
#9 10.13 File "/download_model.py", line 4, in <module>
#9 10.13 from vllm.model_executor.weight_utils import prepare_hf_model_weights, Disabledtqdm
#9 10.13 File "/vllm-installation/vllm/model_executor/__init__.py", line 2, in <module>
#9 10.13 from vllm.model_executor.model_loader import get_model
#9 10.13 File "/vllm-installation/vllm/model_executor/model_loader.py", line 10, in <module>
#9 10.13 from vllm.model_executor.weight_utils import (get_quant_config,
#9 10.13 File "/vllm-installation/vllm/model_executor/weight_utils.py", line 18, in <module>
#9 10.13 from vllm.model_executor.layers.quantization import QuantizationConfig
#9 10.13 File "/vllm-installation/vllm/model_executor/layers/quantization/__init__.py", line 4, in <module>
#9 10.13 from vllm.model_executor.layers.quantization.awq import AWQConfig
#9 10.13 File "/vllm-installation/vllm/model_executor/layers/quantization/awq.py", line 6, in <module>
#9 10.13 from vllm._C import ops
#9 10.14 ImportError: libcuda.so.1: cannot open shared object file: No such file or directory
------
executor failed running [/bin/sh -c if [ -f /run/secrets/HF_TOKEN ]; then export HF_TOKEN=$(cat /run/secrets/HF_TOKEN); fi && if [ -n "$MODEL_NAME" ]; then python3 /download_model.py; fi]: exit code: 1
30 replies
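The ImportError above is a build-time classic: libcuda.so.1 is supplied by the host's NVIDIA driver at container run time (via the NVIDIA container runtime), not by the CUDA toolkit inside the image, so it simply does not exist during `docker build`. Importing vllm._C from /download_model.py dlopens it and fails. Below is a minimal sketch of a download step that avoids importing vllm at build time; snapshot_download is the real huggingface_hub API, but the script itself and the cache path are assumptions, not worker-vllm's actual code:

```python
# Hypothetical stand-in for /download_model.py that never imports vllm,
# so nothing tries to load libcuda.so.1 during `docker build`.
import os

from huggingface_hub import snapshot_download  # pure Python, no CUDA needed

model_name = os.environ.get("MODEL_NAME", "")
if model_name:
    snapshot_download(
        repo_id=model_name,
        token=os.environ.get("HF_TOKEN") or None,
        cache_dir="/model-cache",  # illustrative path, not the worker's real one
    )
```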
RunPod
Created by Casper. on 2/28/2024 in #⚡|serverless
worker-vllm build fails
Getting the same error: ImportError: libcuda.so.1: cannot open shared object file: No such file or directory
30 replies
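A quick way to confirm the failure is the missing host driver rather than a vllm bug is to try loading the library directly in the same environment. This sketch uses only the Python standard library:

```python
import ctypes

# libcuda.so.1 is injected by the NVIDIA container runtime at run time;
# inside `docker build` there is no GPU driver, so this load fails.
try:
    ctypes.CDLL("libcuda.so.1")
    print("driver library found: running on a GPU host")
except OSError as exc:
    print(f"driver library missing (expected during docker build): {exc}")
```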
RunPod
Created by Casper. on 2/28/2024 in #⚡|serverless
worker-vllm build fails
I’ll try it later
30 replies
RunPod
Created by Casper. on 2/28/2024 in #⚡|serverless
worker-vllm build fails
My model is not quantized
30 replies
RunPod
Created by Casper. on 2/28/2024 in #⚡|serverless
worker-vllm build fails
bfeb60c54eaad2eeffa9741ce7600eb30e573698
30 replies
RunPod
Created by Casper. on 2/28/2024 in #⚡|serverless
worker-vllm build fails
Just pushing my new serverless image now. Would love for this to be fixed so that I can upgrade
30 replies
RunPod
Created by Casper. on 2/28/2024 in #⚡|serverless
worker-vllm build fails
Actually, I had to go even further back to get it working 😅
30 replies
RunPod
Created by Casper. on 2/28/2024 in #⚡|serverless
worker-vllm build fails
I checked out commit 2b5b8dfb61e32d221bc8ce49f98ec74698154a6e to get it working for now. It seems the latest release is broken somehow
30 replies