Created by TumbleWeed on 2/16/2024 in #⚡|serverless
Run LLM Model on Runpod Serverless
Hi there,
I have an LLM packaged inside a Docker image, and the image has grown to over 40 GB.
I'm wondering: can I mount the model as a volume instead of baking it into the Docker image?
Thanks!
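For context, a minimal sketch of the idea: RunPod serverless workers expose an attached network volume at `/runpod-volume`, so the handler can read model weights from there instead of from the image. The directory layout (`models/my-llm`) and the baked-in fallback path are assumptions for illustration, not a fixed RunPod API.

```python
import os

# Assumed mount point for a RunPod network volume on serverless workers.
VOLUME_ROOT = os.environ.get("VOLUME_ROOT", "/runpod-volume")

def resolve_model_path(volume_root: str = VOLUME_ROOT) -> str:
    """Prefer weights on the mounted volume; fall back to a copy in the image."""
    volume_copy = os.path.join(volume_root, "models", "my-llm")  # hypothetical layout
    baked_in_copy = "/app/models/my-llm"                         # hypothetical fallback
    return volume_copy if os.path.isdir(volume_copy) else baked_in_copy

print(resolve_model_path())
```

With this pattern the Docker image stays small (code and runtime only), and the 40 GB of weights live on the volume, uploaded once and shared by every worker.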
50 replies