How are the cold starts?

How are Prisma cold starts on cloud functions/serverless environments, like Lambdas?
1 Reply
Prisma AI Help
Prisma has made significant improvements to address cold start issues in serverless environments like AWS Lambda. Here's a summary of the current state:

1. Improved Performance: Prisma has reduced cold start times significantly. According to their blog post, they've sped up serverless cold starts by 9x.
2. JSON-based Wire Protocol: Prisma introduced a more efficient JSON-based wire protocol that replaced the previous GraphQL-based protocol. This change has greatly improved startup performance in serverless environments.
3. Reduced Bundle Size: The Prisma team has reduced the bundle size of Prisma ORM to less than 1 MB, making it suitable for use in the free plans of major edge function providers.
4. Comparable Performance: Prisma's startup performance is now on par with or faster than most competing ORMs when deployed to serverless functions.
5. Ongoing Improvements: The Prisma team is continuing to work on further improvements, including a move from Rust to TypeScript for certain components, which should further streamline deployment in serverless environments.

It's worth noting that while Prisma itself no longer adds significant overhead to cold starts, establishing a database connection still contributes to overall cold start time. This is a general challenge in serverless environments and not specific to Prisma.

For the most up-to-date and detailed information on Prisma's performance in serverless environments, see their blog post: How We Sped Up Serverless Cold Starts with Prisma by 9x.
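Since connection setup is the part of the cold start that Prisma cannot eliminate, a common mitigation on Lambda is to create the `PrismaClient` once, outside the handler, so warm invocations reuse the existing connection instead of reconnecting on every request. The sketch below is a minimal illustration of that pattern, assuming `@prisma/client` and the `aws-lambda` type package are installed, `DATABASE_URL` is configured, and a hypothetical `User` model exists in the schema; adapt it to your own models and handler shape.

```ts
// Minimal sketch: reuse a single PrismaClient across warm Lambda invocations.
import { PrismaClient } from "@prisma/client";
import type { APIGatewayProxyEvent, APIGatewayProxyResult } from "aws-lambda";

// Instantiated once at module load (cold start), then reused while the
// execution environment stays warm, avoiding repeated connection setup.
const prisma = new PrismaClient();

export const handler = async (
  event: APIGatewayProxyEvent
): Promise<APIGatewayProxyResult> => {
  // `user` is a hypothetical model name; replace with one from your schema.
  const users = await prisma.user.findMany({ take: 10 });

  return {
    statusCode: 200,
    body: JSON.stringify(users),
  };
};
```

For workloads with many concurrent Lambdas, a connection pooler in front of the database is still advisable, since each warm environment holds its own connection.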
