Handling Concurrency and Rate Limiting for Batch API Calls
Hi - if I have a backend API with a `/batch` endpoint that processes many API calls in one request, what primitive should I be looking at to handle concurrency and rate limiting? Could I keep it as simple as adding a delay inside `getBatchIds`, which calls `getId` on an iterable? Or should I reach for something heavier like a rate limiter, streams, or a queue? I don't need to make it complicated, so ideally I could just add a random delay between the iterations.
