Cold starts on Edge?

Hi, I have an API route that I made using PlanetScale's database-js library. Here's the code:
import { Client } from "@planetscale/database";

export const runtime = "edge"; // 'nodejs' (default) | 'edge'

const db = new Client({
  url: process.env["DATABASE_URL"],
});

export async function GET() {
  const startTime = Date.now(); // start timestamp

  const conn = db.connection();
  const subjectList = await conn.execute("SELECT name FROM subjects");

  const rows = subjectList.rows; // just the result rows
  const json = JSON.stringify(rows);

  const endTime = Date.now(); // end timestamp
  const durationInMs = endTime - startTime;

  console.log(`Execution time with edge: ${durationInMs}ms`);

  return new Response(json, {
    headers: {
      "content-type": "application/json;charset=UTF-8",
      "access-control-allow-origin": "*",
    },
  });
}
I've timed this API call to see whether it's any faster than using Prisma, but I've noticed spikes in the response times, which I'm assuming are cold starts (I've attached a screenshot). I'm pretty sure I'm doing something wrong or have misunderstood something, because from what I know edge functions are supposed to get rid of cold starts. If anyone can help me figure out where I went wrong, or whether I'm completely misunderstanding edge functions, I'd appreciate it.
6 Replies
Neto · 15mo ago
Are you running the edge route in the same region as the db? Edge does have cold starts, btw. It doesn't matter if you're running on the edge if the data is too far from the user.
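(If this is a Next.js App Router route on Vercel, one way to act on this advice is the preferredRegion route segment config. A minimal sketch — the "iad1" value is an assumption for a db in AWS us-east-1; use whichever Vercel region actually matches your db:)

import { Client } from "@planetscale/database";

export const runtime = "edge";
// Pin the function near the db rather than near the user.
// "iad1" (us-east-1) is an assumption here; adjust to your db's region.
export const preferredRegion = "iad1";

const db = new Client({ url: process.env["DATABASE_URL"] });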
wlvz · 15mo ago
I'm pretty sure the issue comes from the fact that it's a local dev server, although I could be wrong. Ah okay, but shouldn't they be a lot less than ~700ms compared to ~250ms? I think if I deployed this project to Vercel the 'cold start' time would be reduced, since the dev server often recompiles on changes and runs everything under Node.
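(One way to check that is to time the query on its own, separately from the handler as a whole, so dev-server compile time doesn't get counted as a cold start. A rough sketch, reusing the db client from the original snippet:)

export async function GET() {
  const handlerStart = Date.now();

  // Time only the database round trip.
  const queryStart = Date.now();
  const result = await db.connection().execute("SELECT name FROM subjects");
  const queryMs = Date.now() - queryStart;

  const totalMs = Date.now() - handlerStart;
  console.log(`query: ${queryMs}ms, handler total: ${totalMs}ms`);

  // If totalMs spikes while queryMs stays flat, the spike isn't the db.
  return Response.json(result.rows);
}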
Neto · 15mo ago
Try running in the same region as the db on Vercel, just in case.
wlvz · 15mo ago
alright tysm
iukea · 15mo ago
Any updates on this? Curious.
wlvz · 15mo ago
Deploying and testing it there, as opposed to a local dev server, got rid of those massive cold starts. Now, even when the server is completely cold, I still get around a ~200-250ms response time. Changing the edge function's region to the same one as my db also helped reduce the response time, which was really nice. That said, my db and edge function locations are both in us-east even though I'm nowhere near that geographically, so I want to move both my db and my function somewhere closer to me and test again. When testing on this site (https://www.awsspeedtest.com/latency) I can see that my latency to us-east-1 (the region of my db) is around ~250-300ms (the same as how long a request to my site takes from my location), and if I switch my db location to somewhere closer to me I can get that down to around ~80ms.
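(For anyone repeating this comparison, a rough way to measure it from your own machine without a third-party site is to time a few fetches against the deployed route. A sketch — the URL is a placeholder:)

// Placeholder URL -- point this at your deployed route.
const url = "https://your-app.vercel.app/api/subjects";

async function main() {
  for (let i = 0; i < 5; i++) {
    const start = Date.now();
    const res = await fetch(url);
    await res.text(); // make sure the body is fully received
    console.log(`request ${i + 1}: ${res.status} in ${Date.now() - start}ms`);
  }
}

main();

(The first request includes any cold start; the later ones should show the warm, network-dominated latency.)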