justin
RunPod
•Created by TheWitcher.... on 10/26/2024 in #⚡|serverless
How to send an image as a prompt to vLLM?
So before sending it in the JSON, encode the image as base64, then decode it on the other end.
As long as the image is not too big.
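Roughly something like this on the client side (a rough sketch: the endpoint URL pattern, API key placeholder, and the payload keys like `image_base64` are just examples of what your handler might expect, and it assumes the `requests` package is installed):
```python
import base64

import requests

# Read the image and base64-encode it so it survives JSON serialization.
with open("cat.png", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

# Example payload shape; match whatever keys your serverless handler expects.
payload = {
    "input": {
        "prompt": "Describe this image.",
        "image_base64": image_b64,
    }
}

resp = requests.post(
    "https://api.runpod.ai/v2/<ENDPOINT_ID>/runsync",  # placeholder endpoint ID
    headers={"Authorization": "Bearer <RUNPOD_API_KEY>"},  # placeholder API key
    json=payload,
)
print(resp.json())
```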
5 replies
RunPod
•Created by TheWitcher.... on 10/26/2024 in #⚡|serverless
How to send an image as a prompt to vLLM?
You need to base64-encode it before sending and decode it on the other end.
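On the worker side, the decode step inside a RunPod serverless handler could look like this (a minimal sketch: the `image_base64` key is the same assumed name as above, and it assumes Pillow is installed for turning the bytes back into an image):
```python
import base64
import io

import runpod
from PIL import Image


def handler(job):
    # Pull the base64 string out of the job input and decode it back into bytes.
    image_b64 = job["input"]["image_base64"]  # key name is just an example
    image = Image.open(io.BytesIO(base64.b64decode(image_b64)))

    # ... pass `image` (plus the text prompt) to your vLLM / vision model here ...
    return {"width": image.width, "height": image.height}


runpod.serverless.start({"handler": handler})
```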
5 replies
How to override ollama/ollama image to run a model at startup
2) Uh, maybe a bash script that starts the server in the background on startup and then kicks off the model download, or change the path it checks for models to point at your network volume. Something like the rough sketch below.
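The same idea written as a Python startup script instead of bash (a sketch under assumptions: the `ollama` CLI is on PATH, `/workspace/ollama-models` stands in for wherever your network volume is mounted, and `llama3` is just an example model):
```python
import os
import subprocess
import time

# Point Ollama's model store at the network volume (path is an example).
os.environ.setdefault("OLLAMA_MODELS", "/workspace/ollama-models")

# Start the Ollama server in the background.
server = subprocess.Popen(["ollama", "serve"])

# Crude wait for the server to come up; a real script would poll
# http://localhost:11434 instead of sleeping a fixed time.
time.sleep(5)

# Download the model at startup (or reuse it if it's already on the volume).
subprocess.run(["ollama", "pull", "llama3"], check=True)

# Keep the container alive as long as the server runs.
server.wait()
```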
6 replies
How to send a request to a pod?
Okay, step by step, manually at least:
I create a RunPod template on:
runpod/pytorch:2.1.0-py3.10-cuda11.8.0-devel-ubuntu22.04
Create a virtual env:
python3 -m venv venv
source venv/bin/activate
pip install flask
Run the Flask app (a minimal example main.py is sketched below):
python main.py
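For reference, a minimal main.py that would work with those steps (the routes and port 5000 are arbitrary choices, and you'd still need to expose that port on the pod):
```python
from flask import Flask, jsonify, request

app = Flask(__name__)


@app.route("/ping", methods=["GET"])
def ping():
    # Simple health check you can hit to confirm the pod is reachable.
    return jsonify({"status": "ok"})


@app.route("/predict", methods=["POST"])
def predict():
    # Echo back whatever JSON was sent, as a stand-in for real inference code.
    data = request.get_json(force=True)
    return jsonify({"received": data})


if __name__ == "__main__":
    # Bind to 0.0.0.0 so the app is reachable through the pod's exposed port.
    app.run(host="0.0.0.0", port=5000)
```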
29 replies