Supabase · 5h ago
tomi

[URGENT] supabase queue + cron

Hi, I’m building an app that needs to run analysis for multiple users, and I want to use Supabase Queues + Cron to handle this efficiently.

Here’s my setup:
1. I have an Edge Function that processes the analysis for a single user (takes ~30s max due to OpenAI calls).
2. I need to process thousands of users daily.
3. I want to use Supabase Queues to enqueue users and process them in batches (rough sketch of what I have in mind below).
4. I want to use Supabase Cron to trigger the processing every few minutes.

Questions:
1. Is this the right approach for batch processing with Supabase Queues?
2. What’s the recommended batch size to stay under the 60s Edge Function timeout?
3. For local development, should I use the official PGMQ or create a simple database table?
4. Any best practices for handling OpenAI API timeouts in queue processing?
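Rough sketch of what I have in mind for the enqueue side (untested; the profiles table and the user_analysis queue name are just placeholders, and it assumes the pgmq_public send wrapper is exposed over the API):

```ts
// Untested sketch of the daily enqueue step (triggered by Supabase Cron).
// "profiles" and the "user_analysis" queue are placeholder names.
import { createClient } from "npm:@supabase/supabase-js@2";

Deno.serve(async () => {
  const supabase = createClient(
    Deno.env.get("SUPABASE_URL")!,
    Deno.env.get("SUPABASE_SERVICE_ROLE_KEY")!,
  );

  // Placeholder query: the users that need analysis today.
  const { data: users, error } = await supabase.from("profiles").select("id");
  if (error) return new Response(error.message, { status: 500 });

  // One queue message per user, consumed later in batches.
  // (At thousands of users, a batched/SQL-side enqueue may be better.)
  for (const user of users ?? []) {
    await supabase.schema("pgmq_public").rpc("send", {
      queue_name: "user_analysis",
      message: { user_id: user.id },
      sleep_seconds: 0,
    });
  }

  return new Response(`enqueued ${users?.length ?? 0} users`);
});
```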
3 Replies
tomi (OP) · 5h ago
@helpers
ihm40 · 5h ago
This is my best guess given your situation, but I think what these docs show for processing messages from the queue is what you're looking for: https://supabase.com/docs/guides/queues/consuming-messages-with-edge-functions. In your case the processing logic would be your OpenAI calls. The best practice is to not delete a message from the queue until it has finished processing; if OpenAI is taking too long or your Edge Function times out, the message just gets picked up again the next time the cron job calls the function. The docs use 5 messages as an example, but honestly the batch size depends heavily on how long each job takes, and I think you'll have to experiment and monitor it (I'd suggest logging start and end times inside the Edge Function).
Consuming Supabase Queue Messages with Edge Functions | Supabase Docs
Learn how to consume Supabase Queue messages server-side with a Supabase Edge Function
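Roughly what that would look like (untested sketch: the queue name, visibility window, and processUser are placeholders, and it uses the pgmq_public read/delete wrappers from those docs):

```ts
// Untested sketch: consume a batch of queue messages from an Edge Function.
// Assumes a queue named "user_analysis" and that the "pgmq_public" schema
// is exposed over the API, as in the linked docs.
import { createClient } from "npm:@supabase/supabase-js@2";

// Placeholder for your per-user OpenAI analysis (~30s).
async function processUser(message: { user_id: string }): Promise<void> {
  // ... call OpenAI, store the result, etc.
}

Deno.serve(async () => {
  const supabase = createClient(
    Deno.env.get("SUPABASE_URL")!,
    Deno.env.get("SUPABASE_SERVICE_ROLE_KEY")!,
  );
  const queue = supabase.schema("pgmq_public");

  // Read up to 5 messages; they stay invisible to other readers for 60s.
  const { data: messages, error } = await queue.rpc("read", {
    queue_name: "user_analysis",
    sleep_seconds: 60,
    n: 5,
  });
  if (error) return new Response(error.message, { status: 500 });

  for (const msg of messages ?? []) {
    try {
      await processUser(msg.message);
      // Only delete once processing succeeded.
      await queue.rpc("delete", {
        queue_name: "user_analysis",
        message_id: msg.msg_id,
      });
    } catch (_err) {
      // Leave the message in place; it becomes visible again after the
      // visibility window and will be retried on the next cron run.
    }
  }

  return new Response("done");
});
```

With ~30s per user you'd realistically only fit one or two sequential messages per run under the 60s limit unless you run the OpenAI calls concurrently, which is why logging the timings matters.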
ihm40 · 5h ago
I haven't used pgmq for local development so I can't speak to it, but in general I would mirror your production setup as closely as possible.
