randhash
RunPod
Created by randhash on 4/8/2024 in #⚡|serverless
Maximum size of single output for streaming handlers
We are currently refactoring our RunPod handlers to run asynchronously and stream results. Unfortunately, we ran into this error when trying to yield result images:
2024-04-08T09:34:46.608281091Z {"requestId": "75dd8d62-adde-402a-902b-bbef06d90064-e1", "message": "Failed to return job results. | 400, message='Bad Request', url=URL('https://api.runpod.ai/v2/s6d4fprlj0v7k5/job-stream/9m2ossyhvxlp9a/75dd8d62-adde-402a-902b-bbef06d90064-e1?gpu=NVIDIA+L4&isStream=false')", "level": "ERROR"}
After some trial and error, this appears to be related to payload size, with the limit somewhere around 100 KB per streamed chunk, although this doesn't seem to be documented anywhere. Since we want to stream the outputs of an image generation model, this constraint would pretty much destroy our efforts and nullify the use of streaming for generative image models. What is the actual payload size limit, and is there a workaround for streaming images, possibly larger than 1 MB?
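One common workaround for per-chunk payload limits like this is to stop yielding raw image bytes from the handler and instead upload each image to external object storage (S3, GCS, etc.), yielding only a small JSON payload containing the URL. Below is a minimal sketch of that pattern; `generate_images` and `upload_image` are hypothetical placeholders for your model loop and your storage helper (they are not RunPod APIs), and the ~100 KB cap is the observed, not officially documented, limit:

```python
import json

# Observed per-chunk limit (~100 KB); not officially documented, so treat
# this as an assumption rather than a guaranteed contract.
MAX_STREAM_PAYLOAD = 100 * 1024


def stream_handler(job, generate_images, upload_image):
    """Sketch of a streaming handler that avoids large yields.

    Instead of yielding base64-encoded images (easily > 1 MB each),
    upload every image to object storage and yield only its URL, which
    keeps each streamed chunk far below the payload limit.
    """
    for index, image_bytes in enumerate(generate_images(job["input"])):
        # upload_image is a placeholder for your S3/GCS upload helper;
        # it should return a (presigned) URL to the stored object.
        url = upload_image(image_bytes, f"{job['id']}-{index}.png")
        chunk = {"image_index": index, "image_url": url}
        # Sanity check: the yielded payload is tiny compared to the image.
        assert len(json.dumps(chunk)) < MAX_STREAM_PAYLOAD
        yield chunk
```

The trade-off is that consumers must make a second request to fetch each image, but the stream itself stays small and incremental regardless of image size.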