Something went wrong ❌ when creating a serverless vLLM endpoint.
See the screenshot for more context. This happened when I clicked Deploy in the vllm-worker setup in RunPod Quick Deploy.
Any extra information to help me replicate this issue?
I don't remember. Maybe RunPod should just log user requests, or log any front-end errors that occur, rather than asking users for reproducible inputs or steps.
Use something like Sentry.
Sentry.io or something like that.
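For context, wiring Sentry into a front end is a small change. A minimal sketch of browser-side setup, assuming the `@sentry/browser` package is installed (the DSN below is a placeholder, not a real project key):

```javascript
// Minimal front-end Sentry setup (sketch; the DSN is a placeholder).
import * as Sentry from "@sentry/browser";

Sentry.init({
  dsn: "https://examplePublicKey@o0.ingest.sentry.io/0", // placeholder DSN
  tracesSampleRate: 0.1, // sample 10% of performance transactions
});

// Unhandled front-end errors are now reported automatically;
// specific failures (e.g. a failed deploy request) can also be
// captured manually with extra context:
Sentry.captureException(new Error("serverless vLLM deploy failed"));
```

With this in place, a "something went wrong" toast would come with a stack trace and session context on the Sentry dashboard instead of requiring users to supply reproduction steps.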
We have user session data, but there are tons of sessions 😂, so I need a bit more info to narrow it down.
Well, based on that timestamp, it happened at 2:49 my time (WIB).
I was creating a serverless endpoint via vLLM Quick Deploy,
and used a token from HF; the model is neuralmagic/Meta-Llama-3.1-70B-Instruct-FP8.
That should narrow down my account's information by a lot. Please ask if you know what information I should provide.
Thanks for the info, this helps. BTW, would you mind DMing me your email?
Yeah, sure.