•Created by Aerotune on 8/1/2024 in #⚡|serverless
Inquiry on Utilizing TensorFlow Serving with GPU in Serverless Configuration
Thank you all for the responses and suggestions! I currently have a working local setup based on a RunPod container with TensorFlow and CUDA, but the image is quite large (~7 GB). I'm also considering TensorFlow Serving, which could bring the image size down to under 1 GB. I'll test both approaches and share my findings once I have more details. This might take some time, but I'll keep you posted!
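For anyone curious, the TensorFlow Serving approach I have in mind would look roughly like this (a sketch, not my final setup — the model path and name `my_model` are placeholders, and the GPU image tag assumes the official `tensorflow/serving:latest-gpu` build):

```shell
# Serve a SavedModel with the official TF Serving GPU image
# (~hundreds of MB instead of a full TF + CUDA dev image).
# /path/to/my_model should contain version subdirs, e.g. my_model/1/saved_model.pb
docker run --gpus all -p 8501:8501 \
  --mount type=bind,source=/path/to/my_model,target=/models/my_model \
  -e MODEL_NAME=my_model \
  -t tensorflow/serving:latest-gpu

# Then query the REST API:
curl -X POST http://localhost:8501/v1/models/my_model:predict \
  -d '{"instances": [[1.0, 2.0, 3.0]]}'
```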
Cheers, Sebastian