Serverless Hardware equivalent of endpoint

Hi guys, I'm currently migrating from a Faster Whisper endpoint to Serverless. What configuration would give me similar inference speed to the Faster Whisper endpoint? Also, what cost difference should I expect?
2 Replies
nerdylive (5mo ago)
Try the recommended GPU. Probably the same speed; the difference is just that you use your own bucket for storage.
slavov.tech | vidfast.ai
Thanks, it seems fine this way.