I really would like to use Cloudflare Queues instead of a third-party provider, but a few things get in the way: consumer location isn't affected by Smart Placement (round trips become a huge issue), concurrent consumers only scale up at the end of a batch, and the 6 simultaneous open connections per Worker instance mean concurrency autoscaling doesn't work as expected and takes too long to scale up. My queue backlog becomes huge. I get that the magic of autoscaling would be great, but reading other people complaining about the same thing shows that we are not there yet, or maybe we are just holding it wrong. I believe things would be way better if consumers scaled up as messages come in, or if we had a min_concurrency setting (of course I don't know how viable either of those would be). I'm really frustrated to have tried Queues again and again since the beta and still hit the same problem.
17 Replies
Nathan Scheele
Hi there! Looking for some help diagnosing a weird issue I've been running into. I am pulling data from an external API and then sending potentially 1k+ updates to Airtable. To circumvent the 1k subrequest limit, I'm caching the updates in KV storage, then sending the key to Queues to actually perform the updates. However, some messages never seem to be received by the consumer and are instead sent directly to the configured DLQ. I have not been able to reproduce this locally; it only happens once deployed. Has anyone experienced something similar and/or have suggestions for debugging?
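For illustration, the pattern described above (stash the payload in KV, enqueue only the key, and have the consumer read it back and call Airtable) might look roughly like the untested sketch below. The binding names (UPDATES_KV, UPDATE_QUEUE) and the Airtable helper are made up, and the producer is shown as a fetch handler purely for brevity:
```ts
// Untested sketch of the KV + Queues pattern described above.
// Binding names (UPDATES_KV, UPDATE_QUEUE) and pushToAirtable are made up.
interface Env {
  UPDATES_KV: KVNamespace;
  UPDATE_QUEUE: Queue<string>;
}

export default {
  // Producer: stash the heavy payload in KV, enqueue only the key.
  async fetch(req: Request, env: Env): Promise<Response> {
    const updates = await req.json(); // the 1k+ Airtable updates
    const key = crypto.randomUUID();
    await env.UPDATES_KV.put(key, JSON.stringify(updates), { expirationTtl: 3600 });
    await env.UPDATE_QUEUE.send(key);
    return new Response("queued");
  },

  // Consumer: look the payload back up by key and perform the updates.
  async queue(batch: MessageBatch<string>, env: Env): Promise<void> {
    for (const msg of batch.messages) {
      const raw = await env.UPDATES_KV.get(msg.body);
      if (raw === null) {
        msg.ack(); // nothing to do; payload expired or was never written
        continue;
      }
      await pushToAirtable(JSON.parse(raw)); // hypothetical helper
      msg.ack();
    }
  },
} satisfies ExportedHandler<Env>;

// Placeholder for the actual Airtable API calls.
async function pushToAirtable(updates: unknown): Promise<void> {
  // ...batch PATCH requests to the Airtable API...
}
```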
Dosha (2w ago)
Amazing, keep it up
LordSilver (2w ago)
Why isn't Queues free? Give at least 1,000 free monthly messages so someone can test stuff.
o7 (2w ago)
a small free tier for queues would be pretty cool
Pranshu Maheshwari
A small update from last week: we've increased the limits for HTTP pull consumers. You can now pull & ack up to 5k messages/s per queue: https://developers.cloudflare.com/changelog/2025-04-17-pull-consumer-limits/
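For anyone consuming over HTTP rather than with a Worker consumer, the pull-and-ack loop looks roughly like the sketch below. It's untested, the endpoint paths and field names are my understanding of the pull API (verify them against the pull consumer docs), and the IDs/token are placeholders:
```ts
// Rough, untested sketch of an HTTP pull consumer loop. Paths and field names
// should be checked against the Queues pull consumer docs; ACCOUNT_ID,
// QUEUE_ID, and API_TOKEN are placeholders.
const ACCOUNT_ID = "your-account-id";
const QUEUE_ID = "your-queue-id";
const API_TOKEN = "your-api-token";

const BASE = `https://api.cloudflare.com/client/v4/accounts/${ACCOUNT_ID}/queues/${QUEUE_ID}`;
const headers = {
  Authorization: `Bearer ${API_TOKEN}`,
  "Content-Type": "application/json",
};

async function pullAndAck(): Promise<void> {
  // Pull a batch of messages; each message is leased for the visibility window.
  const pullRes = await fetch(`${BASE}/messages/pull`, {
    method: "POST",
    headers,
    body: JSON.stringify({ batch_size: 100, visibility_timeout_ms: 30_000 }),
  });
  const { result } = (await pullRes.json()) as {
    result: { messages: { body: unknown; lease_id: string }[] };
  };

  // ...process result.messages here...

  // Acknowledge the messages that were handled successfully.
  if (result.messages.length > 0) {
    await fetch(`${BASE}/messages/ack`, {
      method: "POST",
      headers,
      body: JSON.stringify({
        acks: result.messages.map((m) => ({ lease_id: m.lease_id })),
      }),
    });
  }
}
```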
gruntlord6 (2w ago)
You can also use a Durable Object with an alarm if you just need to break up the requests. Alarms can be created programmatically, so you can build a cron-like loop that triggers at whatever interval you want: have the Durable Object handle however many tasks you need, then set an alarm for the next execution. There's a task I have now that works like this. I may move it to a queue in the future, as it seems like that might simplify the task, but right now it seemed simpler to just use alarms for my use case.
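As an untested sketch of that alarm loop (the class name and the work helper are made up for illustration):
```ts
// Untested sketch of the cron-like alarm loop described above.
// TaskRunner and processPendingTasks are made-up names.
import { DurableObject } from "cloudflare:workers";

export class TaskRunner extends DurableObject {
  // Called once (e.g. from a Worker or a scheduled handler) to start the loop.
  async start(intervalMs: number): Promise<void> {
    await this.ctx.storage.put("intervalMs", intervalMs);
    await this.ctx.storage.setAlarm(Date.now() + intervalMs);
  }

  // Fires when the alarm goes off: do a chunk of work, then schedule the next run.
  async alarm(): Promise<void> {
    await this.processPendingTasks();
    const intervalMs = (await this.ctx.storage.get<number>("intervalMs")) ?? 60_000;
    await this.ctx.storage.setAlarm(Date.now() + intervalMs);
  }

  private async processPendingTasks(): Promise<void> {
    // ...handle however many tasks you need here...
  }
}
```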
aza547 (2w ago)
Thanks for the suggestion. I haven't really looked into Durable Objects, but I implemented this with Queues the other day and it's working nicely.
gruntlord6 (2w ago)
glad to hear you got it figured out
Alec | Kriebel LLC.
I'm getting Queue sendBatch failed: Bad Request despite my requests being correctly shaped. However, I am sending thousands of messages across multiple parallel sendBatch requests. Is it possible this is actually the 5,000 messages-produced-per-second limit instead?
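For context, as I understand the documented limits, sendBatch also has per-call caps (around 100 messages per call plus a total batch size cap) separate from the per-second produce limit, so large sends are usually chunked roughly like this untested sketch (the binding name is made up):
```ts
// Untested sketch of chunking a large send to respect per-call sendBatch limits.
// MY_QUEUE is a made-up binding name; the 100-message cap is my reading of the
// documented limits, so double-check the current Queues limits page.
interface Env {
  MY_QUEUE: Queue<unknown>;
}

async function sendAll(env: Env, bodies: unknown[]): Promise<void> {
  const CHUNK = 100;
  for (let i = 0; i < bodies.length; i += CHUNK) {
    const batch = bodies.slice(i, i + CHUNK).map((body) => ({ body }));
    await env.MY_QUEUE.sendBatch(batch);
    // Sending chunks sequentially (or with limited parallelism) also helps
    // stay under the messages-produced-per-second limit mentioned above.
  }
}
```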
Pawan | Container Shill
Hey everyone, I want to use Cloudflare Queues to offload all Anthropic Amazon Bedrock calls. This is essential to an app I am building where the user calls Amazon Bedrock from the backend and it returns an output after some time, since it is a hefty API call. Please help. The app is written using the following tech stack:
- Python 3.12
- FastAPI
- SQLAlchemy ORM
Pawan | Container Shill
Yeah, I need to implement Cloudflare Queues instead of managing another service like Kafka or RabbitMQ to handle hefty Amazon Bedrock calls.
kleinpetr (6d ago)
Is it possible to use custom serialization by default within the queue? For example, superjson?
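There's no built-in superjson support that I'm aware of, but since message bodies can be plain strings, one workaround is to stringify with superjson before sending and parse in the consumer. A minimal, untested sketch (the MY_QUEUE binding is made up):
```ts
// Untested sketch: send superjson-encoded strings and decode them in the consumer.
// The MY_QUEUE binding name is made up for illustration.
import superjson from "superjson";

interface Env {
  MY_QUEUE: Queue<string>;
}

export default {
  // Producer: stringify with superjson before sending.
  async fetch(req: Request, env: Env): Promise<Response> {
    const payload = { createdAt: new Date(), ids: new Set([1, 2, 3]) };
    await env.MY_QUEUE.send(superjson.stringify(payload));
    return new Response("queued");
  },

  // Consumer: parse each body back with superjson.
  async queue(batch: MessageBatch<string>, env: Env): Promise<void> {
    for (const msg of batch.messages) {
      const payload = superjson.parse<{ createdAt: Date; ids: Set<number> }>(msg.body);
      // ...handle payload...
      msg.ack();
    }
  },
} satisfies ExportedHandler<Env>;
```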
Pranshu Maheshwari
Hey everyone! When developing locally, you can now run separate producer and consumer Workers, with both connected to the same queue. Here’s a guide for how to do this: https://developers.cloudflare.com/queues/configuration/local-development/#separating-producer--consumer-workers
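For reference, the two Workers just need to point at the same queue name in their configs, something like the snippets below (queue, binding, and Worker names are placeholders); the linked guide covers how to run the two dev sessions so they share the same local queue:
```toml
# producer-worker/wrangler.toml (names are placeholders)
name = "producer-worker"
main = "src/index.ts"

[[queues.producers]]
queue = "my-shared-queue"
binding = "MY_QUEUE"
```
```toml
# consumer-worker/wrangler.toml
name = "consumer-worker"
main = "src/index.ts"

[[queues.consumers]]
queue = "my-shared-queue"
max_batch_size = 10
```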