Passing JSON as a parameter to an RPC – is there a size limit?
I'm trying to do a lot of batch processing of data from another API into Supabase.
Looping over my JSON outside the RPC call and inserting one row at a time is predictably slow (and wasteful by nature), so now that I've MVPed what I was trying to do, I'm looking at passing the JSON into the RPC as a parameter and looping inside the RPC (rough sketch at the end of this post).
However, I was wondering if there's an actual limit on the request size you can make – i.e. how many KB/MB of JSON can I sensibly chuck into an RPC before I need to think about paginating my requests?
Bear in mind I'm calling from an edge function and there's no end user to worry about... when's it going to fall over? Which of these is it:
- There's a KB/MB limit on the request size you can make – don't send more than xxx KB/MB!
- There's no limit on request size to .rpc(), but you'll hit a memory limit of xxx KB/MB either on the edge function or the Postgres function if you're building up big lumps of JSON
- It's going to fail in ways you cannot imagine if you send more than xxxx
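For context, here's roughly the pattern I'm moving to – just a minimal sketch, and all the function, table, and column names are placeholders, not my real schema. The edge function batches rows and hands each batch to the RPC as a single JSON parameter (with chunking as a fallback in case there is a practical size limit):

```ts
// Assumed Postgres side (placeholder names) – an RPC that takes jsonb
// and inserts set-based rather than row by row:
//
//   create or replace function insert_rows_from_json(rows jsonb)
//   returns void language sql as $$
//     insert into my_table (col_a, col_b)
//     select col_a, col_b
//     from jsonb_to_recordset(rows) as r(col_a text, col_b int);
//   $$;

import { createClient } from "https://esm.sh/@supabase/supabase-js@2";

const supabase = createClient(
  Deno.env.get("SUPABASE_URL")!,
  Deno.env.get("SUPABASE_SERVICE_ROLE_KEY")!,
);

// Rows per .rpc() call – this is the knob I'd tune if there turns out
// to be a request-size or memory ceiling.
const CHUNK_SIZE = 1000;

async function insertInChunks(rows: Record<string, unknown>[]) {
  for (let i = 0; i < rows.length; i += CHUNK_SIZE) {
    const chunk = rows.slice(i, i + CHUNK_SIZE);
    // The whole chunk goes over as one JSON parameter to the RPC
    const { error } = await supabase.rpc("insert_rows_from_json", { rows: chunk });
    if (error) throw error;
  }
}
```

So the real question is what CHUNK_SIZE (or rather, what payload size in KB/MB) is safe before I need to start paginating like this at all.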