riceboy26
RunPod
•Created by BBAzn on 10/29/2024 in #⚡|serverless
just got hit with huge serverless bill
I also raised a ticket and provided a bunch of info but was basically brushed away
Same over the weekend. I’ve learned to only top up $25 at a time and I’m hesitant to put in more
I check runpod every few hours now to make sure there aren’t any hanging requests
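if you want to script that check instead of eyeballing the console, a quick poll of the endpoint's /health route works; this is just a sketch against RunPod's documented serverless HTTP API, and the env var names / response fields are my assumptions:
```python
# sketch: poll a serverless endpoint's /health route and flag anything still queued/running
import os
import requests

ENDPOINT_ID = os.environ["RUNPOD_ENDPOINT_ID"]   # placeholder env var names
API_KEY = os.environ["RUNPOD_API_KEY"]

resp = requests.get(
    f"https://api.runpod.ai/v2/{ENDPOINT_ID}/health",
    headers={"Authorization": f"Bearer {API_KEY}"},
    timeout=30,
)
resp.raise_for_status()
jobs = resp.json().get("jobs", {})

# anything that sits inQueue/inProgress for hours is what quietly burns credit
print("in queue:", jobs.get("inQueue"), "| in progress:", jobs.get("inProgress"))
# a stuck job can then be cancelled via POST /cancel/<job_id> or purge-queue from the console
```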
19 replies
RunPod
•Created by deepblhe on 10/14/2024 in #⚡|serverless
Testing Endpoint in Local with Docker and GPU
for example, i have an environment variable for the path of a json file with a bunch of configurations (like default parameters for cfg number, samplers, sampling steps, etc.) and i made the path point to a json file on a network volume
this way, if i want to change the json file, i just spin up the cheapest pod connected to that network volume to add or modify a json file
and the serverless endpoint environment variable can just point to that new file
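concretely, the handler side of that pattern looks something like this (path, key names and defaults here are made up; network volumes mount at /runpod-volume on serverless):
```python
import json
import os

# env var set on the endpoint; point it at any json on the attached network volume
CONFIG_PATH = os.environ.get("CONFIG_PATH", "/runpod-volume/configs/default.json")

def load_defaults():
    with open(CONFIG_PATH) as f:
        return json.load(f)  # e.g. {"cfg": 7.0, "sampler": "euler_a", "steps": 30}

def handler(job):
    params = load_defaults()
    params.update(job.get("input", {}))  # request input overrides the json defaults
    # ... hand params to the actual generation code ...
    return {"params_used": params}
```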
it also helps to parameterize as much as you can via environment variables so you don't have to build and deploy a new docker image every time you need to change something
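same idea in miniature (names are just examples): everything tweakable reads from the environment with a default baked into the image, so changing it becomes an endpoint setting instead of a rebuild:
```python
import os

MODEL_NAME = os.environ.get("MODEL_NAME", "sd_xl_base_1.0.safetensors")
MAX_STEPS = int(os.environ.get("MAX_STEPS", "50"))
LOG_LEVEL = os.environ.get("LOG_LEVEL", "INFO")
# edit these on the endpoint's env var page and the next worker picks them up,
# no docker build / push needed
```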
i normally just have the changes working locally (literally by having the gpu server running and running python3 src/rp_handler.py --rp_serve_api --reload) and using the /docs endpoint to trigger a run manually
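for context, the rp_handler.py in that command is just the standard runpod SDK entrypoint, roughly:
```python
# src/rp_handler.py — --rp_serve_api serves this same handler behind a local FastAPI
# test server (that's where the /docs page comes from); in serverless it runs as-is
import runpod

def handler(job):
    job_input = job["input"]      # whatever you POST as {"input": {...}}
    # ... run the actual work here ...
    return {"echo": job_input}    # placeholder result

if __name__ == "__main__":
    runpod.serverless.start({"handler": handler})
```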
then once it works, i build the docker image and run it locally to make sure it works
then once the docker image works, i push it to docker hub and update the serverless endpoint / pod accordingly
8 replies
RunPod
•Created by DEOGEE on 10/25/2024 in #⚡|serverless
Any good tutorials out there on setting up an sd model from civitai on runpod serverless?
then you'll need to boot up the image gen server (comfyui or a1111 or flux) so that your rp_handler can pass the parameters appropriately and generate the image for you
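the handler is then basically a thin proxy to that server's API; a rough sketch for a1111 (port and payload keys are assumptions, comfyui has its own /prompt API):
```python
import requests
import runpod

A1111_URL = "http://127.0.0.1:7860"  # assumes webui was launched with --api on this port

def handler(job):
    payload = {
        "prompt": job["input"]["prompt"],
        "steps": job["input"].get("steps", 25),
        "cfg_scale": job["input"].get("cfg", 7),
    }
    r = requests.post(f"{A1111_URL}/sdapi/v1/txt2img", json=payload, timeout=600)
    r.raise_for_status()
    return {"images": r.json()["images"]}  # base64 images straight from a1111

if __name__ == "__main__":
    runpod.serverless.start({"handler": handler})
```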
depending on the base model (sd1, sd2, sd 3.5, sdxl, flux, pony, etc.), you may have different requirements in your dockerfile to clone the proper repos
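purely as an illustration of what changes (the repos are the public ones, the mapping itself is up to you): the base model mostly decides which server/repo the Dockerfile has to clone, e.g. flux generally means comfyui rather than a1111:
```python
REPOS = {
    "sd15": "https://github.com/AUTOMATIC1111/stable-diffusion-webui",
    "sdxl": "https://github.com/AUTOMATIC1111/stable-diffusion-webui",
    "flux": "https://github.com/comfyanonymous/ComfyUI",
}
print(REPOS["flux"])  # the repo your Dockerfile would git clone for a flux-based model
```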
that blog is for pods which is slightly different from serverless
what's the base civitai model that you're using?
10 replies
RunPod
•Created by RK on 10/29/2024 in #⚡|serverless
Can u run fastapi gpu project on serverless runpod?
you can follow https://github.com/runpod-workers/worker-a1111/blob/main/Dockerfile
but in the start.sh (https://github.com/runpod-workers/worker-a1111/blob/main/src/start.sh#L6), instead of starting /stable-diffusion-webui/webui.py, you run your fastapi command
you'll also need to change the Dockerfile to scrap all the a1111 stuff and replace it with just the fastapi requirements.
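as a sketch, "your fastapi command" can be as small as launching something like this with uvicorn (names, port and routes are placeholders):
```python
# app.py — start.sh would run e.g.: uvicorn app:app --host 0.0.0.0 --port 8000
import torch
from fastapi import FastAPI

app = FastAPI()

@app.get("/healthcheck")
def healthcheck():
    # lets the start script / handler wait until the GPU app is actually up
    return {"ok": True, "cuda": torch.cuda.is_available()}

@app.post("/predict")
def predict(payload: dict):
    device = "cuda" if torch.cuda.is_available() else "cpu"
    # ... run your GPU model here ...
    return {"device": device, "echo": payload}
```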
under the hood, a1111 is a fastapi server already
oh, you're asking for a guide on how to do it
yes
4 replies
RunPod
•Created by NERDDISCO on 8/9/2024 in #⚡|serverless
Slow network volume
i've found a way... after my pc's docker engine broke bc of some weird wsl issue that i still haven't figured out....
i start a GCP VM with the deep learning linux image and a big boot disk and run docker build there. you get to take advantage of enterprise-grade networking, so pushes are much quicker too
a 20gb docker image takes less than 5 minutes to push to dockerhub, whereas it would've taken 40 minutes on my residential toaster wifi
64 replies
RunPod
•Created by riceboy26 on 9/25/2024 in #⚡|serverless
Sharing a pod template
Yea, that’s surprising…. It was created using the desktop web app
But the template had to be created manually instead of cloning a serverless endpoint
Are u sure? I was able to create a shareable serverless template
NVM... there's an option to specify a serverless template instead of a pod template
9 replies
RunPod
•Created by peteryoung2484 on 9/13/2024 in #⚡|serverless
Is there a way to speed up the reading of external disks(network volume)?
Are you pretty glued to docker build cloud or are u open to GCP artifact registry?
I use it for non-api / typical backend docker images and apparently the limit is 5TB https://cloud.google.com/artifact-registry/docs/docker/pushing-and-pulling
Godly
49 replies