How to send an image as a prompt to vLLM?
Hi there,
I am new to Runpod and I'm having trouble sending an image to a Runpod serverless endpoint. The docs explain how an image can be received on the worker side, but not how to send one in the request.
I am using a Qwen2VL model, which accepts an image and a text prompt. I am able to send the text but not the image.
Please help me with this; I am working on an assignment that has to be submitted before a deadline.
Thank you, any help would be appreciated.
3 Replies
You need to base64-encode and decode it.
So before sending it in the JSON payload, encode the image as base64, then decode it on the other end.
This works as long as the image is not too big.
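Here is a minimal client-side sketch of that approach. The endpoint ID, API key, and the input field names (`prompt`, `image_base64`) are placeholders; your handler can name the fields however it likes, since Runpod only requires the payload to sit under an `input` key.

```python
import base64
import requests

# Placeholders -- replace with your own endpoint ID and API key.
ENDPOINT_URL = "https://api.runpod.ai/v2/<YOUR_ENDPOINT_ID>/runsync"
API_KEY = "<YOUR_RUNPOD_API_KEY>"

# Read the image and encode it as a base64 string so it fits in JSON.
with open("example.jpg", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

payload = {
    "input": {
        # Field names here are assumptions -- use whatever keys your
        # serverless handler expects.
        "prompt": "Describe this image.",
        "image_base64": image_b64,
    }
}

resp = requests.post(
    ENDPOINT_URL,
    headers={"Authorization": f"Bearer {API_KEY}"},
    json=payload,
    timeout=300,
)
print(resp.json())
```

On the worker side, the handler would reverse the step with `base64.b64decode(job["input"]["image_base64"])` and open the resulting bytes (e.g. with Pillow) before handing the image to the model.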
If your handler is coded for it, you can send the image in as a URL instead, which gets around the payload size issue.
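A rough sketch of what the worker side of the URL approach could look like. The `image_url` field name is an assumption, and the model call itself is left out:

```python
import io

import requests
from PIL import Image


def handler(job):
    # "image_url" is whatever key the client puts in the input payload.
    image_url = job["input"]["image_url"]

    # Download the image from the URL instead of decoding base64.
    resp = requests.get(image_url, timeout=60)
    resp.raise_for_status()
    image = Image.open(io.BytesIO(resp.content)).convert("RGB")

    # ... pass `image` and the text prompt to the model here ...
    return {"status": "image received", "size": image.size}
```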
How do you serve Qwen2VL on Runpod serverless? Did you use the vLLM template?