RunPod
Created by J* on 5/23/2024 in #⚡|serverless
GPU for 13B language model
Thanks, will give it a try!
10 replies
Well, I tried and failed: CUDA out-of-memory exception. I guess I'll stick to a 48 GB GPU for now.
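For context on why a smaller card OOMs here, a rough back-of-envelope sketch (the overhead factor is an assumption, not a measured value): a 13B-parameter model in fp16 needs about 26 GB for weights alone, before activations and KV cache, so 24 GB cards run out of memory while 48 GB works.

```python
def estimate_vram_gb(n_params: float,
                     bytes_per_param: int = 2,
                     overhead: float = 1.2) -> float:
    """Rough inference VRAM estimate: parameter count x dtype size,
    times an assumed overhead factor for activations / KV cache."""
    return n_params * bytes_per_param * overhead / 1024**3

# 13B parameters in fp16 (2 bytes each) with 20% assumed overhead:
# roughly 29 GB, which exceeds a 24 GB GPU but fits in 48 GB.
print(round(estimate_vram_gb(13e9), 1))
```

Quantizing to 8-bit or 4-bit (bytes_per_param of 1 or 0.5) is the usual way to fit such a model on a smaller card.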