Yep, but I'm looking to do something like `batch.queue === env.FOO_QUEUE.name` to avoid the `startsWith` ugliness. If the queue name were exposed on the binding alongside the `send()` methods, or readable from wrangler.toml, it wouldn't be too bad?
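For context, routing on `batch.queue` would let one consumer dispatch per-queue without string-prefix checks. A minimal sketch, where the queue names (`foo-queue`, `bar-queue`) and handler functions are assumptions for illustration:

```javascript
// Sketch: dispatch on batch.queue instead of startsWith checks.
// batch.queue is the name of the queue the batch was delivered from.
function routeBatch(batch) {
  if (batch.queue === 'foo-queue') return handleFoo(batch);
  if (batch.queue === 'bar-queue') return handleBar(batch);
  throw new Error(`Unexpected queue: ${batch.queue}`);
}

// Hypothetical per-queue handlers.
function handleFoo(batch) { return `foo:${batch.messages.length}`; }
function handleBar(batch) { return `bar:${batch.messages.length}`; }
```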


Both `max_batch_size` and `max_batch_timeout` work together: whichever limit is reached first triggers delivery of a batch.
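As a sketch, those limits live in the consumer section of wrangler.toml (the queue name and the specific values here are assumptions):

```toml
# Consumer settings: a batch is delivered when either limit is hit first.
[[queues.consumers]]
queue = "foo-queue"    # assumed queue name
max_batch_size = 10    # deliver once 10 messages are waiting...
max_batch_timeout = 1  # ...or after 1 second, whichever comes first
max_concurrency = 1    # optional: one consumer invocation at a time
```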
d1a94a7c4e4a4c23a6dbc74a8abbad11 if it helps

I have `max_concurrency` on 1, but if the 10 messages in the batch complete faster than in 1 second then it doesn't work. I could `await wait(1000)` to wait a second.
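`wait` isn't a built-in, so presumably it's a small promise-based sleep helper along these lines:

```javascript
// Hypothetical sleep helper: resolves after the given number of milliseconds.
const wait = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

// Usage inside the queue handler, e.g. to pad out a batch that finishes
// faster than the desired 1-second spacing:
// await wait(1000);
```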


I had a try/catch on my consumer worker, so I was swallowing the errors myself haha. Turns out I was the unreliable one and Queues works great.

You can set `max_concurrency` to 1 so that your consumer will be invoked only one at a time. There's also the `max_batch_size` setting.
export default {
  async queue(batch) {
    const ntfyUrl = 'https://ntfy.sh/doofy';
    // Process every message in the batch, not just the first one.
    for (const message of batch.messages) {
      try {
        const response = await fetch(ntfyUrl, {
          method: 'POST',
          body: message.body,
        });
        if (response.ok) {
          message.ack();
        } else {
          // Non-2xx response: ask Queues to redeliver this message.
          message.retry();
        }
      } catch (error) {
        console.error(`Failed to process message: ${error}`);
        message.retry();
      }
    }
  },
};