•Created by Aerotune on 8/1/2024 in #⚡|serverless
Inquiry on Utilizing TensorFlow Serving with GPU in Serverless Configuration
Hello Runpod Community,
I'm exploring options to use TensorFlow Serving with GPU support in a serverless configuration on Runpod. Specifically, I'm interested in whether it's feasible to make requests from a Runpod serverless handler to a TensorFlow Serving instance running in the same container or environment.
Could anyone clarify if this setup is supported? Additionally, are there alternative recommended approaches for deploying TensorFlow Serving with GPU on Runpod's serverless infrastructure?
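To make the question concrete, here's roughly what I have in mind: launch TensorFlow Serving as a background process in the container, then have the serverless handler forward each job's input to its local REST predict endpoint. This is only a sketch of the idea, not a working deployment; the model name `my_model` and the default REST port 8501 are assumptions, and the handler signature follows the usual RunPod `handler(job)` convention.

```python
import json
import urllib.request

# TF Serving's REST API listens on port 8501 by default.
# "my_model" is a placeholder for whatever model name the server is started with.
TFS_URL = "http://localhost:8501/v1/models/my_model:predict"

def build_predict_request(instances):
    """Build the JSON body expected by TF Serving's REST predict endpoint."""
    return json.dumps({"instances": instances})

def handler(job):
    """Serverless handler (sketch): forward the job input to local TF Serving."""
    body = build_predict_request(job["input"]["instances"]).encode("utf-8")
    req = urllib.request.Request(
        TFS_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```

The assumption here is that TF Serving (started with GPU support in the container's entrypoint) and the handler share localhost, so the round trip stays inside the worker.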
Thank you in advance for your assistance!
Best regards,
Sebastian