tRPC setup for creating serverless AND edge API endpoints

Hey t3-lovers, tl;dr: I am trying to build an app that uses both edge AND serverless functions by creating separate endpoints for the two kinds of functionality. Has anyone done this successfully?

I am building an app that makes OpenAI API requests and is connected to a PlanetScale MySQL DB through a Prisma layer (standard t3 setup). All functions currently run as serverless functions, and I am hitting timeout issues on the OpenAI requests because they take quite long. So I am trying to move all OpenAI functionality to the edge, which would give me longer timeout limits on Vercel and the ability to stream the response.

What I have tried so far:
1. Create a separate context for serverless and edge
2. Create separate routers for serverless and edge
3. Create separate endpoints /pages/api/trpc/[trpc].ts and /pages/api/trpc/edge.ts that use the respective context & router (rough sketch below)
4. Introduce a rewrite for the routes in vercel.json: { "rewrites": [ { "source": "/api/openai/:path*", "destination": "/api/trpc/edge" } ] }

I am still not 100% sure this is even possible. Has anyone done something like this? Am I on the right track?
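For step 3, this is roughly what I have in mind for the edge-side handler: tRPC's fetch adapter running on the Vercel Edge Runtime. It's only a sketch, not battle-tested, and `edgeRouter` / `createEdgeContext` are placeholder names for the edge-only router and context from steps 1 and 2. One thing I'm unsure about: since httpBatchLink appends the procedure path to the endpoint URL, the file probably needs to be a catch-all like pages/api/trpc/edge/[trpc].ts rather than edge.ts, so that e.g. /api/trpc/edge/openai.complete still resolves.

```ts
// pages/api/trpc/edge/[trpc].ts — rough sketch, not battle-tested.
// `edgeRouter` / `createEdgeContext` are placeholders for the edge-only
// router and context from steps 1 and 2.
import { fetchRequestHandler } from "@trpc/server/adapters/fetch";

import { edgeRouter } from "~/server/api/edge/root";
import { createEdgeContext } from "~/server/api/edge/context";

// Run this route on Vercel's Edge Runtime instead of a serverless lambda.
export const config = {
  runtime: "edge",
};

export default async function handler(req: Request) {
  return fetchRequestHandler({
    endpoint: "/api/trpc/edge",
    req,
    router: edgeRouter,
    createContext: createEdgeContext,
  });
}
```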
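On the client, instead of (or in addition to) the vercel.json rewrite, I was thinking a splitLink could route the OpenAI procedures to the edge endpoint while everything else keeps hitting the normal serverless endpoint. Again just a sketch: it assumes the OpenAI procedures live under an `openai` sub-router and that both halves can be exposed to the client under a single `AppRouter` type.

```ts
// utils/api.ts — sketch only. Assumes an `openai` sub-router for the edge
// procedures and a combined `AppRouter` type for the client.
import { createTRPCNext } from "@trpc/next";
import { httpBatchLink, splitLink } from "@trpc/client";
import superjson from "superjson";

import type { AppRouter } from "~/server/api/root";

const getBaseUrl = () => {
  if (typeof window !== "undefined") return ""; // browser: relative URL
  if (process.env.VERCEL_URL) return `https://${process.env.VERCEL_URL}`;
  return `http://localhost:${process.env.PORT ?? 3000}`;
};

export const api = createTRPCNext<AppRouter>({
  config() {
    return {
      transformer: superjson,
      links: [
        splitLink({
          // Anything under the `openai` router goes to the edge endpoint...
          condition: (op) => op.path.startsWith("openai."),
          true: httpBatchLink({ url: `${getBaseUrl()}/api/trpc/edge` }),
          // ...everything else keeps using the regular serverless endpoint.
          false: httpBatchLink({ url: `${getBaseUrl()}/api/trpc` }),
        }),
      ],
    };
  },
});
```

If the splitLink approach works, I suspect the vercel.json rewrite might not even be needed, but I'm not sure.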
2 Replies
cje · 9mo ago
IIRC the cal.com codebase does this
notoriousfeli · 9mo ago
any chance you could guide me to where you saw this?