Server API endpoints are slow

Hey there, I am currently trying to optimize the speed of my API routes. In general they are pretty slow, so I tried stripping everything out of them. Even with everything removed, an empty API request still takes about 150ms, which I find extremely long. Here is the example code:
export default defineEventHandler(async (event) => {
  // empty handler, used purely to measure per-request overhead
  const startTime = Date.now()
  console.log(startTime, 'start')
  console.log(Date.now(), 'end', { took: Date.now() - startTime })
  return true
})
I deploy to Vercel with no further config. Does anyone else have this issue? Or is 150ms just acceptable for an API request? I remember setting up basic Node.js servers in the past that would handle (almost) empty requests in 10-20ms.
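For anyone comparing numbers: a minimal sketch (assuming the usual Nuxt server route auto-imports, with a hypothetical file path) that reports the handler's own execution time via a Server-Timing response header, so the browser's network panel can show how much of the 150ms is spent inside the function versus in network transit or cold starts:

```ts
// server/api/ping.ts (hypothetical path)
export default defineEventHandler(async (event) => {
  const startTime = Date.now()

  // ...the actual work would go here; the handler is otherwise empty...

  // setHeader is an h3 utility, auto-imported in Nuxt server routes.
  // Server-Timing entries show up in the browser devtools "Timing" tab.
  setHeader(event, 'Server-Timing', `handler;dur=${Date.now() - startTime}`)
  return true
})
```

If the reported duration stays in the single-digit milliseconds while the request still takes ~150ms, the time is being lost outside the function.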
5 Replies
Fabian B. · 2y ago
Here is a screenshot of how the request looks in the browser. Anyone? 😅 Maybe this would better fit in a GitHub Discussion
Fabian B. · 2y ago
GitHub: Server API endpoints (even with nothing in it) are too slow (150ms...)
Fabian B. · 2y ago
So there are no (custom) proxies or anything; it's hosted on Vercel without any further configuration. Locally, it only takes about 7ms. I thought it could be cold starts of the Vercel function, but that should only hit the first request, and for the next seconds/minutes, queries to that same (empty) endpoint should be significantly faster. Maybe I misunderstood how Nuxt server functions run on Vercel. Yeah, I'll try that out.
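To separate cold starts from steady-state latency, a rough client-side sketch (the URL is a placeholder; Node 18+ with its built-in fetch and performance globals is assumed) that fires a few requests in a row; if only the first one is slow, cold starts are the likely culprit, otherwise the overhead is per-request:

```ts
// measure-latency.ts (hypothetical helper script)
const url = 'https://example.vercel.app/api/ping' // placeholder deployment URL

async function measure(runs = 5) {
  for (let i = 0; i < runs; i++) {
    const start = performance.now()
    await fetch(url)
    const took = Math.round(performance.now() - start)
    console.log(`request ${i + 1}: ${took}ms`)
  }
}

measure()
```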
Fabian B. · 2y ago
Nuxt on the Edge – Vercel
Vue based SSR on the edge, powered by Nuxt 3, Nitro, and Vercel Edge Functions.
Fabian B. · 2y ago
Will try that out too. Maybe it's just because I'm in Germany and the default location for Nuxt functions is the USA.

Update: Deployed on the edge, an almost empty function that just returns some headers takes around 105-130ms. So a bit better, but the tradeoff is that it uses Vercel's beta edge network and you're limited to a small function size, so I don't think that's it. Same with Netlify, about 120-140ms response time. Seems like it's not about the hosting provider itself. Will try to deploy it to DigitalOcean next.
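One thing that might be worth trying if the routes stay on regular serverless functions rather than the edge: Vercel lets you pin the function region in vercel.json, so a deployment aimed at users in Germany could use Frankfurt (fra1), for example:

```json
{
  "regions": ["fra1"]
}
```

Region pinning only reduces network distance; it won't remove cold-start or routing overhead, but with ~7ms locally versus ~150ms deployed, the round trip to a US region is a plausible chunk of the difference.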