Server API endpoints are slow
Hey there, I'm currently trying to optimize the speed of my API routes. In general they're pretty slow, so as a baseline I removed everything from them. Even with everything removed, an empty API request still takes about 150ms, which I find extremely long. Here is the example code:
I deploy to Vercel with no further config.
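The snippet itself didn't make it into the post; an empty Nuxt 3 API route of the kind described would look roughly like this (a reconstruction, not the original code; `defineEventHandler` is auto-imported by Nuxt and comes from h3):

```typescript
// server/api/ping.ts — a minimal "empty" API route, reconstructed from the
// description above (the original snippet isn't shown in the thread).
// `defineEventHandler` is auto-imported by Nuxt (it comes from h3).
export default defineEventHandler(() => {
  // no work at all: just return a tiny payload
  return { ok: true }
})
```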
Does anyone else have this issue? Or is 150ms just acceptable for an API request? I remember setting up basic Node.js servers in the past that would do 10-20ms for (almost) empty requests.
5 Replies
Here is a screenshot of how the request looks in the browser.
Anyone?
Maybe this would better fit in a GitHub Discussion
GitHub
Server API endpoints (even with nothing in it) are too slow (150ms ...
So there are no (custom) proxies or anything; it's hosted on Vercel without any further configuration. Locally, the same request only takes about 7ms.
So I thought it could be cold starts of the Vercel function, but then only the first request should be slow; hitting the same (empty) endpoint again within the next seconds or minutes should be significantly faster.
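One way to check the cold-start theory is to time several sequential requests and compare the first one to the rest. A minimal sketch (the local server here is just a stand-in target so the script is self-contained; point `url` at the deployed endpoint for the real test):

```typescript
// Time N sequential requests and compare the first (a potential cold start)
// to the warm ones. Requires Node 18+ for the built-in fetch.
import { createServer } from "node:http";
import { once } from "node:events";
import { performance } from "node:perf_hooks";

// Stand-in target: a bare local server with an empty handler.
const server = createServer((_req, res) => res.end("{}"));
server.listen(0);
await once(server, "listening");
const { port } = server.address() as { port: number };
const url = `http://127.0.0.1:${port}/`; // replace with the deployed URL

const times: number[] = [];
for (let i = 0; i < 5; i++) {
  const t0 = performance.now();
  await fetch(url);
  times.push(performance.now() - t0);
}
server.close();

const [first, ...warm] = times;
const median = warm.sort((a, b) => a - b)[Math.floor(warm.length / 2)];
console.log(`first: ${first.toFixed(1)} ms, warm median: ${median.toFixed(1)} ms`);
// If cold starts were the cause, the first sample should dwarf the warm median.
```

If the warm median stays around 150ms against the deployed endpoint, cold starts aren't the explanation.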
Maybe I misunderstood how Nuxt server functions run on Vercel.
Yeah I'll try that out
Nuxt on the Edge - Vercel
Vue based SSR on the edge, powered by Nuxt 3, Nitro, and Vercel Edge Functions.
Will try that out too. Maybe it's just because I'm in Germany and the default location for Nuxt functions is the USA.
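For reference, switching Nitro to Vercel's edge runtime is a one-line config change (a sketch; `vercel-edge` is Nitro's preset name for Vercel Edge Functions):

```typescript
// nuxt.config.ts — ask Nitro to build for Vercel Edge Functions
export default defineNuxtConfig({
  nitro: {
    preset: "vercel-edge",
  },
})
```

The same thing can be set at build time via the `NITRO_PRESET=vercel-edge` environment variable.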
Update: deployed on the edge, an almost empty function that just returns some headers takes around 105-130ms. So a bit better, but with the tradeoff that it's running on Vercel's beta edge network.
and you're limited to small function size, so I don't think that's it
Same with Netlify: about 120-140ms response. So it doesn't seem to be about the hosting provider itself. Will try to deploy it to DigitalOcean next.
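For a plain VM like a DigitalOcean droplet, Nitro's `node-server` preset (its default) builds a standalone Node server; a minimal sketch:

```typescript
// nuxt.config.ts — build a self-contained Node server, suitable for a VPS
export default defineNuxtConfig({
  nitro: {
    preset: "node-server", // Nitro's default preset; shown here for clarity
  },
})
```

After `nuxt build`, the server starts with `node .output/server/index.mjs`.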