fireice
RunPod
•Created by fireice on 11/14/2024 in #⚡|serverless
How to get the progress of a processing job in serverless?
with self.progress_bar(total=num_inference_steps) as progress_bar:
    for i, t in enumerate(timesteps):
        runpod.serverless.progress_update(job, f"Finished step {i + 1} / {len(timesteps)}")
Maybe I should put it here, at the end of each step execution?
10 replies
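Yes, the end of each step is the natural place. A minimal sketch of that idea, decoupled from the pipeline so it can be tested: `make_step_callback` builds a diffusers-style `callback_on_step_end`, and `send` is whatever delivers the message (e.g. `lambda msg: runpod.serverless.progress_update(job, msg)`); the helper name and this wiring are assumptions, not RunPod's documented API.

```python
def make_step_callback(send, total_steps):
    """Build a diffusers-style callback_on_step_end that reports progress.

    send: any function taking a message string, e.g.
          lambda msg: runpod.serverless.progress_update(job, msg)
    """
    def on_step_end(pipeline, step, timestep, callback_kwargs):
        # step is zero-based, so step + 1 is the count of finished steps.
        send(f"Finished step {step + 1} / {total_steps}")
        return callback_kwargs
    return on_step_end
```

Passing the result as `callback_on_step_end=make_step_callback(send, num_inference_steps)` to a recent diffusers pipeline call should fire one update per denoising step.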
RunPod
•Created by fireice on 11/14/2024 in #⚡|serverless
How to get the progress of a processing job in serverless?
My requirement is to generate only one photo each time. For the progress updates, I need the system to send a progress update after each step during the generation of a single photo. If generating one photo takes 30 steps, I expect an update after each step so that the client can display the progress as N / 30.
10 replies
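On the client side, the poller that displays N / 30 just needs to parse the progress string out of the status response. A small sketch of that parsing step (the message format matches the snippet above; the exact shape of the status response depends on the SDK and is not shown here):

```python
import re

def parse_progress(message):
    """Parse a 'Finished step N / M' progress string into (done, total).

    Returns None when the message doesn't match, e.g. before the
    first update has been sent.
    """
    m = re.match(r"Finished step (\d+) / (\d+)", message or "")
    if not m:
        return None
    return int(m.group(1)), int(m.group(2))
```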
RunPod
•Created by fireice on 11/14/2024 in #⚡|serverless
How to get the progress of a processing job in serverless?
Can you supply real project code? The documentation is too simple; I can't follow it. In a real project, I don't know how long each step will take. Code like
    for update_number in range(0, 3):
        runpod.serverless.progress_update(job, f"Update {update_number}/3")
will not work.
10 replies
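The fix for the docs snippet is to key updates to real completed work rather than a fixed loop count. A sketch under that assumption: `steps` is any sequence of callables representing the actual work units, and `send` is the delivery function (e.g. `lambda msg: runpod.serverless.progress_update(job, msg)`); both names are hypothetical.

```python
def run_with_progress(job, steps, send):
    """Run each work unit in order, reporting after each one completes.

    Updates are emitted only when a real step finishes, so step
    duration doesn't matter.
    """
    results = []
    total = len(steps)
    for i, step in enumerate(steps, start=1):
        results.append(step())  # the actual work for this step
        send(f"Update {i}/{total}")
    return results
```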
RunPod
•Created by fireice on 11/14/2024 in #⚡|serverless
How to get the progress of a processing job in serverless?
Thank you
10 replies
RunPod
•Created by fireice on 7/23/2024 in #⚡|serverless
Why "CUDA out of memory" today? Same image to generate a portrait: yesterday it was OK, today it is not.
OK, I see, I will test.
47 replies
RunPod
•Created by fireice on 7/23/2024 in #⚡|serverless
Why "CUDA out of memory" today? Same image to generate a portrait: yesterday it was OK, today it is not.
I am the developer. When I use my AI app, I get CUDA out of memory. I did nothing to the app.
47 replies
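When unchanged code suddenly hits CUDA OOM on serverless, a common cause is being scheduled onto a GPU with less memory than before. A small diagnostic sketch (the helper name is hypothetical; it returns None when torch or CUDA is unavailable) that can be logged at the start of the handler:

```python
import importlib.util

def gpu_memory_report():
    """Best-effort free/total GPU memory report, or None if unavailable.

    Logging this per job makes it obvious when the worker landed on a
    smaller GPU than the previous run.
    """
    if importlib.util.find_spec("torch") is None:
        return None
    import torch
    if not torch.cuda.is_available():
        return None
    free, total = torch.cuda.mem_get_info()  # bytes on the current device
    gib = 1024 ** 3
    return f"{free / gib:.1f} GiB free of {total / gib:.1f} GiB"
```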
RunPod
•Created by fireice on 7/4/2024 in #⚡|serverless
Can I select the GPU type based on the base model in a Python script?
I see, thanks
9 replies
RunPod
•Created by fireice on 7/4/2024 in #⚡|serverless
Can I select the GPU type based on the base model in a Python script?
9 replies
Can I use torch 2.3.0 + CUDA 11.8 on RunPod?
Traceback (most recent call last):
2024-06-05T09:04:11.388043478Z File "/runpod-volume/55fd91b5/prod/instantid/src/handler.py", line 14, in <module>
2024-06-05T09:04:11.388417474Z import diffusers
2024-06-05T09:04:11.388430521Z ModuleNotFoundError: No module named 'diffusers'.
I confirm that diffusers 0.27.0 is already installed. Why does this issue always occur in serverless?
23 replies
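A frequent cause of this on serverless is that the package was installed for a different Python than the one running the handler (e.g. installed onto the network volume but that path is not on the worker's sys.path). A sketch of a check to log from inside the handler (the helper name is hypothetical):

```python
import importlib.util
import sys

def diagnose_import(module_name):
    """Report whether module_name is importable from *this* interpreter."""
    spec = importlib.util.find_spec(module_name)
    return {
        "python": sys.executable,          # which interpreter is running
        "found": spec is not None,
        "origin": spec.origin if spec else None,
        "search_path": list(sys.path),     # where imports are looked up
    }
```

If `found` is False but `pip show diffusers` succeeds in a shell, comparing the two interpreters' paths usually pinpoints the mismatch.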
RunPod
•Created by fireice on 5/22/2024 in #⚡|serverless
timeout in JavaScript SDK does not work
I see: in index.ts there is import { curry, clamp, isNil } from "ramda", but I had not installed ramda before, so run = curry() did not work. Now it works.
16 replies
RunPod
•Created by fireice on 5/22/2024 in #⚡|serverless
timeout in JavaScript SDK does not work
16 replies
RunPod
•Created by fireice on 5/22/2024 in #⚡|serverless
timeout in JavaScript SDK does not work
Just now I changed the default number from 3000 to 300000, but it still does not work; it still times out at 3000.
16 replies