RunPod
•Created by Satpal on 5/8/2024 in #⚡|serverless
Serverless Error Kept Pod Active
Not currently, since the code for the platform is closed source. The blog usually gets product updates every week or so, so that wouldn't be a bad place to start
51 replies
RunPod
•Created by Satpal on 5/8/2024 in #⚡|serverless
Serverless Error Kept Pod Active
We'll figure out if we can enable something like this
51 replies
RunPod
•Created by Mandragora.ai on 5/10/2024 in #⚡|serverless
Serverless broke for me overnight, I can't get inference to run at all.
no rush on that ofc
101 replies
RunPod
•Created by Mandragora.ai on 5/10/2024 in #⚡|serverless
Serverless broke for me overnight, I can't get inference to run at all.
can you DM me your runpod email? We'll figure out some comp for this - really sorry for the issues this caused
101 replies
RunPod
•Created by Mandragora.ai on 5/10/2024 in #⚡|serverless
Serverless broke for me overnight, I can't get inference to run at all.
it should be live in less than 30m
101 replies
RunPod
•Created by Scott 🌱 on 5/10/2024 in #⚡|serverless
Please focus on usability
The initial UI was pretty mobile friendly, but we haven't done a great job of keeping up. Some of these issues extend to desktop too, though
18 replies
RunPod
•Created by Scott 🌱 on 5/10/2024 in #⚡|serverless
Please focus on usability
Thanks for the feedback, we're on it!
18 replies
RunPod
•Created by Scott 🌱 on 5/10/2024 in #⚡|serverless
Please focus on usability
Haha we might just be interested
18 replies
RunPod
•Created by Jidovenok on 2/21/2024 in #⚡|serverless
All 27 workers throttled
Hey, I know not much I can say after the fact can fix past pain, but we have made a few platform releases in the past day to improve the throttling, as well as added more capacity (way more coming next week). We've got a lot of customers using serverless and we've experienced a spike in consumption usage that is just enormous, and we're trying our best to handle it. We apologize for affecting your business, and we are trying our best to find a balance between action and messaging.
239 replies
RunPod
•Created by Jidovenok on 2/21/2024 in #⚡|serverless
All 27 workers throttled
No, we had an internal discussion and all agreed that the quota shouldn't have been increased in this case.
239 replies
RunPod
•Created by Jidovenok on 2/21/2024 in #⚡|serverless
All 27 workers throttled
what happened in the past few days is that a few of our larger customers flexed up 600+ serverless workers
239 replies
RunPod
•Created by Jidovenok on 2/21/2024 in #⚡|serverless
All 27 workers throttled
we're thinking of just allowing users to set a quota per GPU type in addition to assigning launch priority
239 replies
RunPod
•Created by Jidovenok on 2/21/2024 in #⚡|serverless
All 27 workers throttled
it's not a bug so much as the priority algo isn't good
239 replies
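To make the idea floated in the last thread concrete, here is a minimal sketch of what a per-GPU-type quota plus launch priority could look like. This is purely hypothetical: RunPod has not published such a feature or API, and the type names, quotas, and function below are illustrative assumptions, not anything from the platform.

```python
# Hypothetical sketch only -- not a RunPod API. It illustrates the proposal
# from the thread above: a per-GPU-type quota plus a launch priority order
# that decides where the next serverless worker starts.

from dataclasses import dataclass

@dataclass
class GpuTypeConfig:
    gpu_type: str   # illustrative label, e.g. "A100"
    quota: int      # max workers the user allows on this GPU type
    priority: int   # lower number = tried first when launching a worker

def pick_gpu_type(configs, running_workers):
    """Return the GPU type to launch the next worker on, or None if every
    type is at its quota. `running_workers` maps gpu_type -> current count."""
    for cfg in sorted(configs, key=lambda c: c.priority):
        if running_workers.get(cfg.gpu_type, 0) < cfg.quota:
            return cfg.gpu_type
    return None  # all quotas exhausted; the request would queue or throttle

if __name__ == "__main__":
    configs = [
        GpuTypeConfig("A100", quota=10, priority=0),
        GpuTypeConfig("A6000", quota=27, priority=1),
    ]
    running = {"A100": 10, "A6000": 5}
    print(pick_gpu_type(configs, running))  # -> "A6000" (A100 quota is full)
```

The point of the sketch is the separation of concerns the reply hints at: quotas cap how many workers each GPU type can hold, while priority decides the order in which spare capacity is used, so a user could prefer cheaper or more available GPU types without giving up an upper bound on any of them.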