RunPod3mo ago
deepblhe

RunPods Serverless - Testing Endpoint in Local with Docker and GPU

I’m creating a custom container to run FLUX and LoRA on RunPod, using this Stable Diffusion example as a starting point. I successfully deployed my first pod on RunPod, and everything worked fine.

However, my issue arises when I make code changes and want to test my endpoints locally before redeploying; constantly deploying to RunPod for every small test is quite time-consuming. I found a guide for local testing in the RunPod documentation, but unfortunately it only provides a simple example that suggests running the handler function directly, like this:

python your_handler.py --test_input '{"input": {"prompt": "The quick brown fox jumps"}}'

This does not work for me because it ignores the Docker setup entirely and runs the function in my local Python environment. I want to go beyond this and test the Docker image end to end locally, on my GPU, with the exact dependencies and setup used when deploying on RunPod.

Is there specific documentation for testing Docker images locally for RunPod, or a recommended workflow for this kind of setup? I tried following the guidelines for local testing here: https://docs.runpod.io/serverless/workers/development/local-testing
Test locally | RunPod Documentation
Learn how to test your Handler Function locally using custom inputs and a local test server, simulating deployment scenarios without the need for cloud resources.
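For context, the `--test_input` flow the docs describe amounts to calling the handler function once with a parsed JSON payload, outside Docker. A minimal sketch (the handler logic here is hypothetical; a real worker would register it with runpod.serverless.start and actually run FLUX):

```python
import json

# Hypothetical RunPod-style serverless handler. A real worker would register
# it via runpod.serverless.start({"handler": handler}) and run inference here.
def handler(job):
    prompt = job["input"]["prompt"]
    # A real FLUX worker would generate an image; this sketch just echoes.
    return {"echo": prompt}

# Roughly what --test_input does: parse the JSON payload and invoke the
# handler once, in the local Python environment rather than inside Docker.
payload = json.loads('{"input": {"prompt": "The quick brown fox jumps"}}')
print(handler(payload))
```

This is exactly why the approach doesn't satisfy the question above: it exercises the handler code but none of the container's dependencies or CUDA setup.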
8 Replies
deepblhe
deepblheOP3mo ago
Stack Overflow
RunPods Serverless - Testing Endpoint in Local with Docker and GPU
Madiator2011
Madiator20113mo ago
I usually just deploy pod and test there
tzk
tzk3mo ago
hi @Elder Papa Madiator, how do you deploy a serverless worker as a pod?
Madiator2011
Madiator20113mo ago
Just deploy pod and copy worker code and install all packages
tzk
tzk3mo ago
Can you clarify “worker code”? My serverless template is a custom docker image
deepblhe
deepblheOP3mo ago
Hi all! I have a solution for this in case you are looking:
docker run --gpus all -p 8080:8080 -v "$(pwd)/test_input.json:/test_input.json" ${IMAGE_REPO}
Have the test_input.json in your local folder; this will just start the endpoint, run the test, and terminate.
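For anyone else landing here, a fuller sketch of that workflow (the image tag and build step are placeholders for your own setup; the docker steps are guarded so the snippet degrades gracefully on machines without Docker):

```shell
# Placeholder image tag; substitute the image you push to RunPod.
IMAGE_REPO=my-flux-worker:dev

# 1. Create the payload the worker consumes on startup.
cat > test_input.json <<'EOF'
{"input": {"prompt": "The quick brown fox jumps"}}
EOF

# 2. Build and run locally with GPU access. Requires Docker plus the
#    NVIDIA container toolkit for --gpus all; skipped if docker is absent.
if command -v docker >/dev/null; then
  docker build -t "$IMAGE_REPO" .
  docker run --gpus all -p 8080:8080 \
    -v "$(pwd)/test_input.json:/test_input.json" \
    "$IMAGE_REPO"
fi
```

This tests the image end to end, with the exact dependencies baked into the container, which is what the original question was after.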
tzk
tzk3mo ago
Thanks for sharing @flavio - just to clarify, what pod template did you use to be able to run the above command?
nerdylive
nerdylive3mo ago
A serverless worker template, using the runpod handler as entry point, with an argument passing the file mounted inside the container
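Concretely, that arrangement looks something like the Dockerfile sketch below. The base image, package list, and handler path are all hypothetical placeholders, not RunPod's official ones; adapt them to your worker:

```dockerfile
# Hypothetical worker image sketch; base tag and paths are placeholders.
FROM python:3.10-slim
RUN pip install runpod
COPY handler.py /handler.py
# The handler script is the entrypoint. When /test_input.json exists in the
# container (e.g. mounted at docker run time), the worker runs that single
# job and exits instead of waiting for serverless requests.
CMD ["python", "-u", "/handler.py"]
```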