RunPod · 13mo ago

Download Stuck - Worker VLLM

The download gets stuck at some value: https://github.com/runpod-workers/worker-vllm
19 Replies
reavatar (OP) · 13mo ago
This started happening with the new worker-vllm changes and wasn't happening earlier.
Justin Merrell · 13mo ago
What is the full docker build command you are using? @propback
reavatar (OP) · 13mo ago
sudo docker build -t username/image:tag --build-arg MODEL_NAME="openchat/openchat_3.5" --build-arg MODEL_BASE_PATH="/models" .
With username/image:tag being my own, of course. This is the same command listed in the README docs.
Alpay Ariyak · 13mo ago
Hey, how long did you keep it running?
reavatar (OP) · 13mo ago
40 minutes. I had previously cloned up to the commit below, and that was working: https://github.com/runpod-workers/worker-vllm/tree/24feadd903cf5528b3e5ec8ce400fd6a184ecc04
Now, with the current commit, this part of the Dockerfile:
# Quick temporary updates
RUN pip install git+https://github.com/runpod/runpod-python@a1#egg=runpod --compile
is causing trouble. If this could be worked out, I'd at least have something working for now.
Alpay Ariyak · 13mo ago
Could you try now, please? It's fixed in the latest version. The only thing you can't do atm is build from a machine without GPUs.
reavatar (OP) · 13mo ago
It exhausts my 32GB of RAM while running vLLM's setup.py.
reavatar (OP) · 13mo ago
Also tried using runpod/worker-vllm:dev directly, but it failed while running the container on RunPod.
reavatar (OP) · 13mo ago
Also thought about building the Docker image on a RunPod GPU, but Docker wouldn't start there.
Alpay Ariyak · 13mo ago
@merrell.io I think you need to update the GitHub Action for pushing the image; it still hasn't run since my last commit, which was 4h ago.
RunPod machines are inside Docker containers, so you can't run Docker inside them.
@reavatar You can use alpayariyak/vllm:11.8 in the meantime.
reavatar (OP) · 13mo ago
I get: error pulling image: Error response from daemon: manifest for alpayariyak/vllm:11.8 not found: manifest unknown: manifest unknown
Alpay Ariyak · 13mo ago
Try again
reavatar (OP) · 13mo ago
OK, the download worked now. Thanks!
Alpay Ariyak · 13mo ago
No problem!
reavatar (OP) · 13mo ago
By the way, this doesn't return logprobs even when they're passed in sampling params.
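For reference, the request the OP describes presumably looked something like the payload below. The exact input schema of the worker endpoint is an assumption here, not confirmed in the thread:

```python
# Hypothetical request body for a serverless vLLM worker endpoint.
# The field names ("input", "sampling_params", "logprobs") are an
# assumption based on the discussion, not a documented schema.
payload = {
    "input": {
        "prompt": "Is the sky blue? Answer True or False.",
        "sampling_params": {
            "max_tokens": 1,
            "temperature": 0.0,
            "logprobs": 5,  # ask for top-5 logprobs per generated token
        },
    }
}
```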
Alpay Ariyak · 13mo ago
Yeah, it's not supported atm
reavatar (OP) · 13mo ago
Any ETA?
Alpay Ariyak · 13mo ago
Anywhere between 1 and 3 weeks, really; it's a bit far down the backlog, as there's very little demand for it. I'm curious, what do you need them for?
reavatar (OP) · 13mo ago
Using it to create labels, e.g. True/False, where I want to pick the highest-probability label. I also have a kind of system where I need to verify whether a particular response was generated by a model or not. I think I'll just switch to Modal for now.
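As an aside, the label-picking step described above can be sketched in plain Python, assuming the endpoint eventually returns one logprob per candidate label (the response shape below is hypothetical):

```python
import math

def pick_label(logprobs: dict) -> tuple:
    """Return the highest-probability label and its probability.

    logprobs maps each candidate label (e.g. "True"/"False") to the
    log probability the model assigned to it.
    """
    label = max(logprobs, key=logprobs.get)
    # Convert the winning log probability back to a probability.
    return label, math.exp(logprobs[label])

# Hypothetical logprobs for a True/False classification.
label, prob = pick_label({"True": -0.105, "False": -2.303})
```

Since log is monotonic, comparing logprobs directly is enough to pick the winner; the exp() is only needed if the probability itself matters (e.g. for a confidence threshold).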