RunPod · 8mo ago
Emad

Cost me $25 for a small request.

(screenshots attached)
18 Replies
nerdylive · 8mo ago
I don't get this, what's the problem? Also, for logs, please copy-paste them or send them in a text file.
Emad (OP) · 8mo ago
It charged me for 5.3 hours when the request wasn't that long.
nerdylive · 8mo ago
The billing shows that it was running for 5.3 hrs. What caused that? I'd suggest using the web chat or email support to resolve this.
Emad (OP) · 8mo ago
The program just ran into an error, which you can see in the logs. How could it run for 5.3 hours?
nerdylive · 8mo ago
I don't know, maybe something is still running. Check your Docker template or image to see what runs in the main thread. There's also a timeout setting for workers; what do you have it set to? The default is usually an hour or less.
Emad (OP) · 8mo ago
Should I resolve this with the webchat?
nerdylive · 8mo ago
Yes, or by email. They have a better understanding of your account's usage and billing too.
Madiator2011 · 8mo ago
From what I see, your worker was throwing an error and looping. If the error isn't handled properly, the worker will keep looping and never stop. You probably want to check your worker's code.
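(For anyone hitting the same thing: a minimal sketch of the pattern being described, assuming a Python worker built on the runpod SDK. The handler name, input field, and run_inference stub are placeholders rather than anything RunPod requires; the point is that an exception should end the job with an error result instead of taking the worker down.)

```python
import runpod  # RunPod serverless SDK (pip install runpod)

def run_inference(prompt: str) -> str:
    # Placeholder for the real model call; replace with your own code.
    return f"echo: {prompt}"

def handler(job):
    # Wrap the work so a failure ends this job with an error result
    # instead of crashing the worker process.
    try:
        prompt = job["input"]["prompt"]
        return {"output": run_inference(prompt)}
    except Exception as exc:
        # Returning an "error" key should report the job as failed
        # (check the RunPod handler docs for the exact contract).
        return {"error": str(exc)}

runpod.serverless.start({"handler": handler})
```

With something like this, a bad request fails fast and the worker stays idle instead of running up hours.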
Emad (OP) · 8mo ago
And the server was on for 5 hours?
Madiator2011 · 8mo ago
Yes, if you did not set it to end. I'm not sure how your worker is built.
houmie · 8mo ago
I have noticed this too. If your LLM throws a CUDA out-of-memory error, it will loop forever unless you cancel the job manually. That's very dangerous on an expensive GPU. I wish there were a better way to handle it.
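(One hedged way to contain that failure mode in your own handler, assuming a PyTorch-based worker: catch the OOM explicitly and return an error so the job finishes instead of looping. torch.cuda.OutOfMemoryError exists in PyTorch 1.13+; older versions raise a plain RuntimeError. The safe_generate wrapper below is a sketch, not part of any RunPod API.)

```python
import torch

def safe_generate(model_generate, prompt: str) -> dict:
    # Wrap an LLM call so CUDA OOM becomes a reported job failure
    # instead of an unhandled crash that keeps the worker busy.
    try:
        return {"output": model_generate(prompt)}
    except torch.cuda.OutOfMemoryError as exc:  # PyTorch 1.13+; older versions raise RuntimeError
        torch.cuda.empty_cache()                # release cached blocks before the next job
        return {"error": f"CUDA out of memory: {exc}"}
```

The returned dict can be passed straight back from a handler like the one sketched above.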
nerdylive · 8mo ago
Eh, isn't there already a running timeout? You can set it in your endpoint settings.
houmie · 8mo ago
Can you show us how? That would be great. Thanks.
nerdylive · 8mo ago
Go to your endpoint settings, expand anything you can find, and take a screenshot so I can see the options. There should be something called "execution timeout", I'm pretty sure.
houmie · 8mo ago
Ahh yes. I found it. Mine isn't enabled.
(screenshot attached)
houmie · 8mo ago
60 seconds is reasonable, right?
digigoblin · 8mo ago
Depends on your specific endpoint.
nerdylive · 8mo ago
Yep, it depends on what you're doing. Try to estimate the normal execution time from your endpoint graphs, maybe?
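(Besides the endpoint-wide setting, the serverless API also seems to let you cap a single job via an execution policy on the /run request. The field names below, especially policy.executionTimeout in milliseconds, are from memory and should be checked against the endpoint docs; the endpoint ID and prompt are placeholders.)

```python
import os
import requests

ENDPOINT_ID = "your-endpoint-id"            # placeholder
API_KEY = os.environ["RUNPOD_API_KEY"]      # assumes your API key is in the environment

resp = requests.post(
    f"https://api.runpod.ai/v2/{ENDPOINT_ID}/run",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "input": {"prompt": "hello"},
        # Per-request execution policy (field name from memory, verify in the docs):
        # executionTimeout is in milliseconds, so 60_000 caps the job at one minute.
        "policy": {"executionTimeout": 60_000},
    },
    timeout=30,
)
print(resp.json())
```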