loup — Supabase Discord, 2y ago

Upsert big array to reduce queries

Well (on the road to the 1304948 questions I've asked in this Discord, sorry again...), I have a Python script that updates my Supabase database from the TMDB API (the movie API). To keep all actors up to date, there are around 5,000 changes per day. My current script gets all the IDs that need updating, and for each one it makes 2 requests to TMDB to fetch details about that person in 2 languages. It then makes 1 Supabase upsert to update the main information in one table (tmdb_person) and another upsert with 2 rows in the translations table tmdb_person_translation (one row for English, one for French). But when I run the script, it takes around 4 hours, and during that time my self-hosted Supabase instance is completely overloaded; my web app can't even handle login events. So my idea is, instead of making 2 Supabase upserts every iteration, to store all the fetched person data locally in a variable, and then make 2 final upserts to my Supabase instance with all of it. Is that a good idea?
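For what it's worth, here is a minimal sketch of that batching idea, assuming supabase-py and a TMDB response shape with `id`, `name`, and per-language `biography` fields (those field names are assumptions, not your actual schema). The pure functions build the two payloads; the actual upserts are shown commented out since they need a live client. Chunking the final batch is safer than one giant request:

```python
def chunked(rows, size=500):
    """Yield successive chunks so a single upsert never sends the whole batch."""
    for i in range(0, len(rows), size):
        yield rows[i:i + size]

def build_batches(people):
    """Split locally accumulated TMDB data into the two upsert payloads:
    one row per person for tmdb_person, and two rows (en/fr) per person
    for tmdb_person_translation."""
    person_rows = []
    translation_rows = []
    for p in people:
        person_rows.append({"id": p["id"], "name": p["name"]})
        for lang in ("en", "fr"):
            translation_rows.append({
                "person_id": p["id"],
                "language": lang,
                "biography": p["biography"][lang],
            })
    return person_rows, translation_rows

# Later, instead of ~10,000 individual round-trips:
# for chunk in chunked(person_rows):
#     supabase.table("tmdb_person").upsert(chunk).execute()
# for chunk in chunked(translation_rows):
#     supabase.table("tmdb_person_translation").upsert(
#         chunk, on_conflict="person_id,language"
#     ).execute()
```

This turns thousands of tiny writes into a handful of large ones, which should relieve most of the connection and lock pressure on the instance.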

Also, is it possible to call a PostgreSQL function with .rpc in Python? I need a custom upsert for tmdb_person_translation (see this article: https://medium.com/@vovavc/how-to-upsert-data-to-supabase-postgres-database-without-using-an-id-56571f0ad984)
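As a sketch of what that could look like: supabase-py does expose `.rpc()`, which calls a Postgres function through PostgREST, passing the dict keys as the function's named parameters. The function name `upsert_person_translation` and its `rows` parameter below are hypothetical, defined by you in SQL first:

```python
import json

def rpc_payload(rows):
    """Build the argument dict for the RPC call. PostgREST maps each dict
    key to a named parameter of the SQL function, so the payload must be
    JSON-serializable (the round-trip below verifies that)."""
    return {"rows": json.loads(json.dumps(rows))}

# Assuming a SQL function you created yourself, e.g.:
#   create or replace function upsert_person_translation(rows jsonb)
#   returns void language plpgsql as $$ ... $$;
#
# from supabase import create_client
# supabase = create_client(SUPABASE_URL, SUPABASE_KEY)
# supabase.rpc("upsert_person_translation",
#              rpc_payload(translation_rows)).execute()
```

Note that if the upsert only needs `on_conflict` on a composite unique key, `table(...).upsert(rows, on_conflict=...)` may be enough without a custom function.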