RunPod•4mo ago
nerdylive

Something went wrong ✕ when creating serverless vLLM

See the screenshot for more context. This happened when I clicked Deploy in the vLLM worker setup in RunPod Quick Deploy.
6 Replies
yhlong00000•4mo ago
Any extra information to help me replicate this issue?
nerdyliveOP•4mo ago
I don't remember. Maybe RunPod should just log user requests, or if a front-end error occurs, log it rather than asking users for reproducible inputs or steps. Use something like Sentry (sentry.io) or similar.
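A minimal sketch of the kind of front-end error capture being suggested here. The names (`buildReport`, `send`, the endpoint path) are hypothetical for illustration and not RunPod's actual code; in a real setup, `Sentry.init()` plus `Sentry.captureException()` would replace the hand-rolled transport.

```typescript
// Hypothetical sketch of front-end error reporting, not RunPod's code.
// A real setup would use Sentry.init() and a global error handler.

interface ErrorReport {
  message: string;
  stack?: string;
  url: string;
  timestamp: string;
}

// Build a structured report from a caught error so support can search
// by message/URL instead of asking the user to reproduce the issue.
function buildReport(err: Error, url: string): ErrorReport {
  return {
    message: err.message,
    stack: err.stack,
    url,
    timestamp: new Date().toISOString(),
  };
}

// Hypothetical transport: in a browser this would POST to a collector
// endpoint (or call Sentry.captureException).
function send(report: ErrorReport): string {
  return JSON.stringify(report);
}

const report = buildReport(
  new Error("something went wrong when creating serverless vllm"),
  "/console/serverless/quick-deploy", // hypothetical page URL
);
console.log(send(report).includes("quick-deploy")); // true
```

The point of the structured report is exactly what comes up later in the thread: with the message, page URL, and timestamp captured automatically, support can narrow down a session without asking the user for details they may not remember.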
yhlong00000•4mo ago
We have user session data, but there are tons of sessions 😂, so I need a bit more info to narrow it down.
nerdyliveOP•4mo ago
Well, from that timestamp it happened at 2:49 WIB (my local time). I was creating a serverless vLLM quick deploy, used a token from HF, and the model is neuralmagic/Meta-Llama-3.1-70B-Instruct-FP8. That should narrow down my account's information a lot. Please ask if you know what information I should provide.
yhlong00000•4mo ago
Thanks for the info, this helps. Btw, would you mind DMing me your email?
nerdyliveOP•4mo ago
Yeah sure