Serverless vLLM deployment stuck at "Initializing" with no logs
I've been trying for hours. Initially I attempted to deploy Ollama on a Serverless GPU, but it got stuck at "Initializing". Now I'm using the Serverless vLLM option directly, and it still isn't working. Every time I click the deploy button, it just says "Initializing" and nothing more: no progress, no logs whatsoever. Any ideas? Thanks!
4 Replies
Which GPU did you select? It may be that the selected GPU type is currently unavailable,
or that you don't have permission to pull the image or the model.
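As an aside, when picking a GPU tier it helps to sanity-check whether the model's weights even fit in VRAM. Below is a rough back-of-envelope sketch (my own estimate, not anything vLLM-specific): fp16 weights take 2 bytes per parameter, and I pad by ~20% for KV cache and activations. The function name and overhead factor are hypothetical.

```python
def min_vram_gib(num_params: float, bytes_per_param: int = 2, overhead: float = 1.2) -> float:
    """Rough minimum VRAM in GiB: weights (fp16 = 2 bytes/param)
    plus ~20% headroom for KV cache and activations (assumed, not exact)."""
    return num_params * bytes_per_param * overhead / 2**30

# A 7B-parameter model in fp16 comes out around 15-16 GiB,
# so it fits a 24 GB card; a 34B model would need the 80 GB tier.
print(f"{min_vram_gib(7e9):.1f} GiB")
```

This is only a lower bound; longer context lengths and larger batch sizes grow the KV cache well beyond the 20% padding used here.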
I'm using the 24 GB GPU; it looks like that one is unavailable?
OK, after switching to an available one (the 80 GB GPU), it's all working now. Thanks for the help!