Streaming Limits with Vercel
So I'm trying to stream a response from OpenAI, but the request times out at 10.01s on the Vercel deployment due to the runtime limit they set on free-tier serverless functions. My first thought was to move the endpoint to the Edge runtime, but that complicates things with auth. Any ideas on how I could make this work without upgrading to a paid plan? Thanks!
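For context, this is roughly what the Edge-runtime version of the streaming endpoint would look like. It's only a sketch under assumptions (App Router, the `openai` npm package, a hypothetical `/api/chat` route and model name), not my actual code:

```ts
// app/api/chat/route.ts — hypothetical streaming endpoint on the Edge runtime
import OpenAI from "openai";

// Opt this route into the Edge runtime instead of the default serverless runtime,
// which sidesteps the short execution limit on free-tier serverless functions.
export const runtime = "edge";

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

export async function POST(req: Request) {
  const { prompt } = await req.json();

  // Ask OpenAI for a streaming chat completion.
  const completion = await openai.chat.completions.create({
    model: "gpt-3.5-turbo",
    messages: [{ role: "user", content: prompt }],
    stream: true,
  });

  // Re-emit the tokens as a plain text stream so the client can read it incrementally.
  const encoder = new TextEncoder();
  const stream = new ReadableStream({
    async start(controller) {
      for await (const chunk of completion) {
        const token = chunk.choices[0]?.delta?.content ?? "";
        controller.enqueue(encoder.encode(token));
      }
      controller.close();
    },
  });

  return new Response(stream, {
    headers: { "Content-Type": "text/plain; charset=utf-8" },
  });
}
```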
5 Replies
I think you can try Auth.js v5 (NextAuth v5). It works on the Edge runtime, so auth won't be an issue, but its setup is a bit tedious.
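A rough sketch of the v5 split-config pattern that keeps the Edge-running parts free of Node-only code (provider choice, file names, and the matcher are assumptions):

```ts
// auth.config.ts — edge-safe part of the config: no database adapter, JWT sessions
import type { NextAuthConfig } from "next-auth";
import GitHub from "next-auth/providers/github";

export const authConfig = {
  providers: [GitHub],
  session: { strategy: "jwt" },
} satisfies NextAuthConfig;
```

```ts
// middleware.ts — runs on the Edge runtime, so it only pulls in the edge-safe config
import NextAuth from "next-auth";
import { authConfig } from "./auth.config";

export const { auth: middleware } = NextAuth(authConfig);

export const config = { matcher: ["/dashboard/:path*"] }; // hypothetical protected routes
```

The full `NextAuth(...)` call with any database adapter would then live in a separate `auth.ts` that only runs in Node.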
That's if you are not using Prisma
You can use Prisma's cloud Data Proxy to use it on the Edge via a proxy: https://www.prisma.io/blog/database-access-on-the-edge-8F0t1s1BqOJE
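Roughly, the proxy setup looks like this. Just a sketch, assuming `DATABASE_URL` is a `prisma://` connection string from the Prisma Data Platform, a client generated for the proxy (e.g. `prisma generate --data-proxy`), and a hypothetical `User` model:

```ts
// app/api/users/route.ts — hypothetical Edge route querying through the Data Proxy
import { PrismaClient } from "@prisma/client/edge";

export const runtime = "edge";

const prisma = new PrismaClient();

export async function GET() {
  // Queries go over HTTP to Prisma's proxy, so no query engine binary runs here.
  const users = await prisma.user.findMany({ take: 10 });
  return Response.json(users);
}
```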
Yes, but this requires some external service, correct? It's called Accelerate now rather than Data Proxy, if I'm not mistaken.
@Anonymous_hacker
Accelerate is a different thing: what it does is connect to your database through a proxy, without any local query engine. All the DB queries are proxied through their servers, which is why you can use @prisma/client/edge there instead of the normal @prisma/client, which contains a query engine that can't run on the Edge.
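In other words, with Accelerate it ends up looking something like this (a sketch; it assumes @prisma/extension-accelerate is installed, DATABASE_URL is an Accelerate prisma:// connection string, and a hypothetical Post model):

```ts
// Hypothetical data helper using the edge client plus the Accelerate extension.
import { PrismaClient } from "@prisma/client/edge";
import { withAccelerate } from "@prisma/extension-accelerate";

// No query engine ships with the edge client; every query is proxied
// through Accelerate's servers over HTTP.
const prisma = new PrismaClient().$extends(withAccelerate());

export async function getRecentPosts() {
  return prisma.post.findMany({
    take: 10,
    // Accelerate-specific option: cache this query's results for 60 seconds.
    cacheStrategy: { ttl: 60 },
  });
}
```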