Herai_Studios
RunPod
•Created by wizardjoe on 1/4/2024 in #⚡|serverless
Error building worker-vllm docker image for mixtral 8x7b
so how does it look when it works for you?
69 replies
Nice! I'll add that if you're on CUDA 11.8, the reason vllm won't work is that you need a PyTorch build compiled against CUDA 11.8.
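As a sketch of that fix (assuming PyTorch's official CUDA 11.8 wheel index; exact torch/vllm version pins are up to you):

```shell
# Install a PyTorch build compiled against CUDA 11.8 before installing vllm,
# so vllm links against a torch that matches the system CUDA toolkit.
pip install torch --index-url https://download.pytorch.org/whl/cu118
pip install vllm
```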
@ashleyk if I'm not mistaken, @wizardjoe mentioned his machine has a GPU, and mine does too. The docker image isn't working correctly for either of us, though, and it's breaking at the same point.
or you can do `RUN pip install vllm`
I built my own image where I just added vllm to the requirements.txt file, and that worked better
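A minimal Dockerfile sketch of that approach (the base image and paths here are illustrative assumptions, not the actual worker-vllm Dockerfile):

```dockerfile
# Hypothetical sketch: install vllm from requirements.txt as a prebuilt wheel
# instead of compiling it from source inside the image.
FROM nvidia/cuda:11.8.0-runtime-ubuntu22.04
RUN apt-get update && apt-get install -y python3 python3-pip
WORKDIR /app
# requirements.txt contains a line like: vllm
COPY requirements.txt .
RUN pip3 install -r requirements.txt
```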
Just to be clear - it doesn't let you get past the setup.py script for vllm, correct? That's where it breaks for me with the pre-built Dockerfile as well.
@Alpay Ariyak @Justin If you guys have any update on this, I would be interested in knowing the outcome as I am facing this issue as well