I had a similar issue that I think was due to the 1000 max subrequests per worker invocation. I think the queue send counts as a subrequest. https://developers.cloudflare.com/workers/platform/limits/#how-many-subrequests-can-i-make

That doesn't add up with 2600 objects though... that would only be 26 requests with a batch size of 100. 128 MB is the memory limit... but I assume that wouldn't be hit when you are streaming the body.

With sendBatch, there is a 256 KB max total size... maybe you are hitting that at a certain point in your data set, and reducing the number of objects per sendBatch call would fix it?
A batch can contain up to 100 messages, though items are limited to 128 KB each, and the total size of the array cannot exceed 256 KB.
https://developers.cloudflare.com/queues/platform/javascript-apis/#queue
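In case it helps, here's a rough TypeScript sketch of chunking your sends so each sendBatch call stays under both the 100-message and 256 KB limits. The Queue and MessageSendRequest types are the ones from @cloudflare/workers-types, and estimating the size by JSON-serializing the body is just an approximation of how Queues measures message size, so treat the numbers as a guide rather than exact:

```ts
// Sketch: chunk outgoing messages so each sendBatch call respects both the
// 100-message and 256 KB batch limits. Size is estimated by serializing the
// body to JSON, which may not match Queues' exact accounting.
import type { Queue, MessageSendRequest } from "@cloudflare/workers-types";

const MAX_BATCH_MESSAGES = 100;
const MAX_BATCH_BYTES = 256 * 1024;

async function sendInChunks(queue: Queue, bodies: unknown[]): Promise<void> {
  let batch: MessageSendRequest[] = [];
  let batchBytes = 0;

  for (const body of bodies) {
    const size = new TextEncoder().encode(JSON.stringify(body)).length;

    // Flush the current batch before it would exceed either limit.
    if (
      batch.length >= MAX_BATCH_MESSAGES ||
      (batch.length > 0 && batchBytes + size > MAX_BATCH_BYTES)
    ) {
      await queue.sendBatch(batch);
      batch = [];
      batchBytes = 0;
    }

    batch.push({ body });
    batchBytes += size;
  }

  if (batch.length > 0) {
    await queue.sendBatch(batch);
  }
}
```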

For the subrequest limit, I solved it by storing the uploaded CSV on R2 and then using another queue to read the CSV. That queue handler would track how many rows it processed, and once it reached 750 (an arbitrary number to leave a buffer under the 1000 limit) it would enqueue another task to read the CSV, passing the number of rows to skip and the path to the CSV on R2 in the queue message data (roughly like the sketch below). I also used a Durable Object to track the progress and push updates via WebSocket to the client that uploaded the CSV.
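Not the exact code I ran, but a rough sketch of that consumer. It assumes bindings named CSV_BUCKET (R2) and CSV_QUEUE (Queues), a message body of { csvKey, rowsToSkip }, and a hypothetical processRow helper standing in for whatever subrequests each row triggers. It also reads the whole object into memory for simplicity, whereas you'd likely want to stream and parse it properly:

```ts
// Sketch of the chained-queue CSV consumer. Binding names, the message shape,
// and processRow are assumptions; adjust to your wrangler.toml and data.
interface CsvTask {
  csvKey: string;     // path of the uploaded CSV in R2
  rowsToSkip: number; // how many rows earlier invocations already handled
}

interface Env {
  CSV_BUCKET: R2Bucket;
  CSV_QUEUE: Queue<CsvTask>;
}

const ROWS_PER_INVOCATION = 750; // buffer under the 1000-subrequest limit

export default {
  async queue(batch: MessageBatch<CsvTask>, env: Env): Promise<void> {
    for (const msg of batch.messages) {
      const { csvKey, rowsToSkip } = msg.body;

      const object = await env.CSV_BUCKET.get(csvKey);
      if (!object) {
        msg.ack();
        continue;
      }

      // Simplified: reads the whole file and splits on newlines instead of streaming.
      const text = await object.text();
      const rows = text.split("\n").filter((line) => line.length > 0);

      const end = Math.min(rowsToSkip + ROWS_PER_INVOCATION, rows.length);
      for (let i = rowsToSkip; i < end; i++) {
        await processRow(rows[i]); // hypothetical per-row work (one or more subrequests)
      }

      // Not done yet: enqueue another task that picks up where this one left off.
      if (end < rows.length) {
        await env.CSV_QUEUE.send({ csvKey, rowsToSkip: end });
      }

      msg.ack();
    }
  },
};

async function processRow(row: string): Promise<void> {
  // Placeholder for whatever subrequest(s) each row triggers.
}
```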