Serverless Hardware equivalent of endpoint
Hi guys, I'm currently migrating from the Faster Whisper endpoint to Serverless. What configuration would give me similar inference speed to the Faster Whisper endpoint? Also, what cost difference should I expect?
Try the recommended GPU.
Probably the same, the only real difference is that you use your own bucket for storage.
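In case it helps, here is a minimal sketch of what the serverless worker could look like, assuming a RunPod-style handler and faster-whisper running in float16 on a GPU. The handler name, the `audio_url` input field, and the model size are illustrative assumptions, not the managed endpoint's actual settings.

```python
# Minimal sketch of a serverless Faster Whisper worker (assumptions:
# RunPod-style handler API, audio passed in as a URL, large-v3 model).
import tempfile
import urllib.request

import runpod
from faster_whisper import WhisperModel

# Load the model once per worker so only cold starts pay the load cost.
# float16 on a GPU is the usual choice for speed comparable to the
# managed Faster Whisper endpoint.
model = WhisperModel("large-v3", device="cuda", compute_type="float16")


def handler(job):
    """Download the audio referenced in the job input and transcribe it."""
    audio_url = job["input"]["audio_url"]  # hypothetical input field

    with tempfile.NamedTemporaryFile(suffix=".wav") as tmp:
        urllib.request.urlretrieve(audio_url, tmp.name)
        segments, info = model.transcribe(tmp.name, beam_size=5)
        text = " ".join(segment.text.strip() for segment in segments)

    return {"language": info.language, "transcription": text}


runpod.serverless.start({"handler": handler})
```

Cost-wise, with serverless you pay per second of worker time rather than for an always-on endpoint, so for intermittent traffic it usually works out cheaper; exact numbers depend on the GPU you pick.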
Thanks, it seems fine this way.