Superintendent
RunPod
•Created by Superintendent on 12/20/2024 in #⚡|serverless
https://github.com/runpod-workers/worker-stable_diffusion_v1
shit, no idea why it was the name
4 replies
RunPod
•Created by Superintendent on 2/16/2024 in #⚡|serverless
Deepseek coder on serverless
thanks
17 replies
RunPod
•Created by Superintendent on 2/16/2024 in #⚡|serverless
Deepseek coder on serverless
neat
17 replies
RunPod
•Created by Superintendent on 2/16/2024 in #⚡|serverless
Deepseek coder on serverless
oh really
17 replies
RunPod
•Created by Concept on 1/15/2024 in #⚡|serverless
RunPod vLLM CUDA out of Memory
not only do you need space for the fp16 weights, you also need space for the KV cache holding the context, which should be about 2 to 3 GB for 4k to 8k context
76 replies
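The memory rule of thumb above can be sketched numerically: fp16 weights take 2 bytes per parameter, and the KV cache takes 2 (K and V) × layers × KV heads × head dim × 2 bytes per token. This is a rough estimate, assuming a hypothetical Llama-7B-like shape (32 layers, 32 KV heads, head dim 128); the exact figures depend on the model config and vLLM's own preallocation.

```python
def estimate_vram_gib(params_billion: float, n_layers: int, n_kv_heads: int,
                      head_dim: int, seq_len: int, batch: int = 1):
    """Rough fp16 VRAM estimate: (weights GiB, KV-cache GiB)."""
    # fp16 weights: 2 bytes per parameter
    weight_bytes = params_billion * 1e9 * 2
    # KV cache: 2 tensors (K and V), 2 bytes each, per layer/head/dim/token
    kv_bytes = 2 * n_layers * n_kv_heads * head_dim * 2 * seq_len * batch
    return weight_bytes / 2**30, kv_bytes / 2**30

# Hypothetical 7B model at 8k context:
weights_gib, kv_gib = estimate_vram_gib(7, 32, 32, 128, 8192)
print(f"weights ≈ {weights_gib:.1f} GiB, KV cache ≈ {kv_gib:.1f} GiB")
```

At 4k context the same shape gives roughly 2 GiB of KV cache and about 4 GiB at 8k, in line with the 2 to 3 GB ballpark quoted in the thread.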
RunPod
•Created by Concept on 1/15/2024 in #⚡|serverless
RunPod vLLM CUDA out of Memory
what context length are you running at?
76 replies