Jason
RunPod
Created by Yebs on 3/17/2025 in #⚡|serverless
Can you now run gemma 3 in the vllm container?
You need access granted on Hugging Face, plus an HF token, to use it.
I just tried it for you!
Yes, it works.
25 replies
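Since Gemma 3 is a gated model, the worker needs both the granted access and a token at startup. A minimal sketch of the environment a vLLM serverless template might need, assuming the `MODEL_NAME`/`HF_TOKEN` variable names from the worker-vllm README; the model id and token value are placeholders:

```python
# Sketch: env vars a vLLM serverless worker may need to pull a gated model.
# Variable names follow the worker-vllm README; values are placeholders.
worker_env = {
    "MODEL_NAME": "google/gemma-3-4b-it",  # assumed model id; use the variant you were granted
    "HF_TOKEN": "hf_xxx",                  # placeholder; a real read token from your HF settings
}

def validate_env(env: dict) -> list[str]:
    """Return the keys still missing for a gated Hugging Face model."""
    required = ("MODEL_NAME", "HF_TOKEN")
    return [k for k in required if not env.get(k)]

print(validate_env(worker_env))  # -> []
```

If the token is missing or the account was never granted access, the worker fails at model download time rather than at request time.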
RunPod
Created by Eren on 3/27/2025 in #⚡|serverless
Meaning of -u1 -u2 at the end of request id?
Have you gotten an answer to this yet, maybe from a support ticket?
3 replies
RunPod
Created by Realised Prophets on 3/15/2025 in #⚡|serverless
Faster-Whisper output "None" — log shows 400 "Bad Request"

11 replies
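A 400 from a serverless worker usually means the request body did not match the schema the handler expects, which then surfaces as a `None` output. A hedged sketch of building a well-formed payload; the `{"input": {"audio": ...}}` shape is an assumption based on common runpod-workers conventions, so check the Faster-Whisper worker's README for the exact field names:

```python
import json

def build_whisper_payload(audio_url: str) -> str:
    """Build a JSON request body for a Faster-Whisper serverless worker.

    The {"input": {"audio": ...}} shape is an assumption; verify it
    against the worker's README before relying on it.
    """
    if not audio_url:
        # An empty or missing audio field is a likely cause of HTTP 400.
        raise ValueError("audio_url must be non-empty")
    return json.dumps({"input": {"audio": audio_url}})

print(build_whisper_payload("https://example.com/sample.wav"))
```

Validating the payload client-side like this separates "my JSON is malformed" from "the worker itself is broken".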
RunPod
Created by wrichert on 3/30/2025 in #⚡|serverless
Hi, I'm new to RunPod and trying to debug this error
If you try the normal image attached in the README.md (the one they've built), does it work?
Or there may be a problem with the whisper repo / its dependencies.
Maybe; you could try it.
6 replies
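The debugging step suggested above, sending the same request to an endpoint running the prebuilt image from the README, isolates whether the failure is in a custom image or in the request itself. A sketch of assembling (but not sending) a RunPod `/runsync` call; the endpoint id and API key here are placeholders:

```python
def build_runsync_request(endpoint_id: str, api_key: str, payload: dict) -> tuple[str, dict, dict]:
    """Assemble (url, headers, body) for a RunPod /runsync call.

    Send the identical body to both your custom endpoint and one built
    from the prebuilt image; if only the custom one fails, the bug is
    in your image, not your request.
    """
    url = f"https://api.runpod.ai/v2/{endpoint_id}/runsync"
    headers = {
        "Authorization": f"Bearer {api_key}",  # placeholder key
        "Content-Type": "application/json",
    }
    return url, headers, {"input": payload}
```

Keeping request construction in one function guarantees both endpoints really do receive the same payload.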
RunPod
Created by Yebs on 3/17/2025 in #⚡|serverless
Can you now run gemma 3 in the vllm container?
Already.
Yes, I think vLLM has been updated.
25 replies
RunPod
Created by ErezL on 3/30/2025 in #⚡|serverless
Length of output of serverless meta-llama/Llama-3.1-8B-Instruct
https://github.com/runpod-workers/worker-vllm/?tab=readme-ov-file#sampling-parameters
You need to expand these 2 for reference.
If you don't want to, change your request to this.
6 replies
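Truncated completions from worker-vllm are usually a sampling-parameters issue: if `max_tokens` stays at the worker default, long outputs get cut off. A sketch of a request body with an explicit length cap, following the `sampling_params` shape described in the README linked above; the particular values are illustrative:

```python
def build_vllm_request(prompt: str, max_tokens: int = 1024) -> dict:
    """Request body for worker-vllm with an explicit output-length cap.

    The sampling_params field follows the worker-vllm README; raise
    max_tokens if completions come back truncated.
    """
    return {
        "input": {
            "prompt": prompt,
            "sampling_params": {
                "max_tokens": max_tokens,  # cap on generated tokens
                "temperature": 0.7,        # example value only
            },
        }
    }
```

Note that the model's context window still bounds prompt length plus `max_tokens`, so the cap cannot be raised indefinitely.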