Mihály
RunPod
Created by Mihály on 10/17/2024 in #⚡|serverless
Worker keeps running after finishing job, burning money?
No description
5 replies
RunPod
Created by Mihály on 9/18/2024 in #⚡|serverless
job timed out after 1 retries
No description
2 replies
RunPod
Created by Mihály on 9/10/2024 in #⚡|serverless
Jobs in queue for a long time, even when there is a worker available
No description
12 replies
RunPod
Created by Mihály on 8/15/2024 in #⚡|serverless
Is there any serverless template, or vLLM compatible HF repo for Vision models?
Hi! Are there any plug-and-play LLaVA serverless templates, or LLaMA 3 (or other) vision models that work with the RunPod vLLM worker? I was using Ashleyk's awesome runpod-worker-llava, but it has since been removed.
2 replies
RunPod
Created by Mihály on 3/26/2024 in #⚡|serverless
Failed to return job results.
Hi! I've been getting this error recently. I'm using runpod 1.6.2:

{"requestId": "cce0d888-3d0c-4186-87cb-b94bfd359a71-e1", "message": "Failed to return job results. | 400, message='Bad Request', url=URL('https://api.runpod.ai/v2/8x6rjph4tvc8mp/job-done/hz8pr8qphlh72y/cce0d888-3d0c-4186-87cb-b94bfd359a71-e1?gpu=NVIDIA+GeForce+RTX+4090&isStream=false')", "level": "ERROR"}

I'm not returning an error dict in the code. @ashleyk Could that be a sign that the result is over the size limit? Has the response size limit changed?
8 replies
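One way to rule out the size-limit theory above is to measure the serialized result before the handler returns it. A minimal sketch, assuming a hypothetical cap and helper name (the actual RunPod limit is not confirmed here):

```python
import json

# Assumed cap for illustration only -- check RunPod's docs for the real
# serverless response size limit before relying on this number.
MAX_RESULT_BYTES = 10 * 1024 * 1024

def guard_result_size(result, limit=MAX_RESULT_BYTES):
    """Return `result` unchanged if its JSON form fits under `limit`;
    otherwise return a small error dict so the worker does not 400
    when posting to the job-done endpoint."""
    payload = json.dumps(result).encode("utf-8")
    if len(payload) <= limit:
        return result
    return {
        "error": "result too large to return inline",
        "size_bytes": len(payload),
        # In practice you might upload `payload` to object storage here
        # and return a presigned URL instead of the raw data.
    }
```

Calling this at the end of the handler (e.g. `return guard_result_size(output)`) makes an oversized result visible in your own logs instead of surfacing only as a generic 400 from the API.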