Marut
RunPod
Created by marticztn on 9/18/2024 in #⚡|serverless
Serverless vLLM deployment stuck at "Initializing" with no logs
Which GPU did you select? It could be that the selected GPU type is not available (see the health-check sketch below).
5 replies
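As a quick way to confirm whether any worker actually comes up for the endpoint, here is a minimal sketch that polls the serverless health route. It assumes the standard `https://api.runpod.ai/v2/<endpoint_id>/health` route and uses placeholder `ENDPOINT_ID` / `RUNPOD_API_KEY` environment variables, none of which come from this thread:
```python
# Minimal sketch: poll a RunPod serverless endpoint's health route to see
# whether any workers ever leave the "Initializing" state.
# ENDPOINT_ID and RUNPOD_API_KEY are placeholders, not values from the thread.
import os
import time

import requests

ENDPOINT_ID = os.environ["ENDPOINT_ID"]
API_KEY = os.environ["RUNPOD_API_KEY"]
URL = f"https://api.runpod.ai/v2/{ENDPOINT_ID}/health"

for _ in range(10):
    resp = requests.get(URL, headers={"Authorization": f"Bearer {API_KEY}"}, timeout=30)
    resp.raise_for_status()
    # Prints the worker/job counts; if no worker ever becomes ready,
    # the selected GPU type may simply be unavailable right now.
    print(resp.json())
    time.sleep(30)
```
If the ready-worker count never moves off zero, switching to (or adding) a more widely available GPU type on the endpoint is the usual fix.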
RunPod
Created by JHenriP on 2/26/2024 in #⚡|serverless
Serverless 404
Hey, it works fine. I tested it.
21 replies
RunPod
Created by JHenriP on 2/26/2024 in #⚡|serverless
Serverless 404
Let me check & try to reproduce!
21 replies
RunPod
Created by JHenriP on 2/26/2024 in #⚡|serverless
Serverless 404
Can you share the error and your setup? That would be helpful.
21 replies
RunPod
Created by JHenriP on 2/26/2024 in #⚡|serverless
Serverless 404
@JHenriP Are you still facing the issue with the worker?
21 replies
RunPod
Created by joshmohrer on 2/5/2024 in #⚡|serverless
Insanely Fast Whisper
@joshmohrer I made minor changes to bring it in sync with upstream. You can now try other languages by specifying the language in the request (see the sketch below). Let me know if you run into any issues.
14 replies
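For reference, a request to a deployed worker with a language specified might look like the sketch below. The input keys (`audio`, `language`) are assumptions for illustration and are not confirmed by the thread; check the worker's handler/README for its actual schema:
```python
# Minimal sketch: call a deployed serverless endpoint and pass a language hint.
# The input keys ("audio", "language") are assumed, not taken from the worker's docs.
import os

import requests

ENDPOINT_ID = os.environ["ENDPOINT_ID"]
API_KEY = os.environ["RUNPOD_API_KEY"]

resp = requests.post(
    f"https://api.runpod.ai/v2/{ENDPOINT_ID}/runsync",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={"input": {"audio": "https://example.com/sample.mp3", "language": "fr"}},
    timeout=300,
)
resp.raise_for_status()
print(resp.json())
```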
RunPod
Created by joshmohrer on 2/5/2024 in #⚡|serverless
Insanely Fast Whisper
I understand, let me take a look at it.
14 replies
RunPod
Created by joshmohrer on 2/5/2024 in #⚡|serverless
Insanely Fast Whisper
I can take a look.
14 replies
RunPod
Created by joshmohrer on 2/5/2024 in #⚡|serverless
Insanely Fast Whisper
Yeah, you can deploy this worker.
14 replies
RunPod
Created by joshmohrer on 2/5/2024 in #⚡|serverless
Insanely Fast Whisper
Is that Docker image built using this worker?
14 replies
RunPod
Created by joshmohrer on 2/5/2024 in #⚡|serverless
Insanely Fast Whisper
Hey @joshmohrer, sure. How can I help? Are you facing any issues with this worker?
14 replies