Serverless vLLM deployment stuck at "Initializing" with no logs

I've been trying for hours. Initially I attempted to deploy Ollama on a Serverless GPU, but it got stuck at "Initializing". Now I'm using the Serverless vLLM option directly and it still isn't working: every time I click the deploy button it just says "Initializing" and nothing more, no logs whatsoever. Any ideas? Thanks!
4 Replies
wiki • 4mo ago
What GPU did you use? It could be that the selected GPU type is not available.
yhlong00000 • 4mo ago
Or you don't have access to pull the image or the model.
marticztn (OP) • 4mo ago
I'm using the 24GB GPU; it looks like it's unavailable?
marticztn (OP) • 4mo ago
OK, after switching to an available one (the 80GB GPU), it's all good now. Thanks for the help!