Memory limits
Hey, quick question about the “Maximum state that can be persisted per Workflow instance” limit.
In my case, I’m dealing with files that can grow way beyond 128GB during a workflow run. Because of that, I end up holding a lot of data in memory, and the workflow crashes before it even finishes the step, since the in-memory buffer alone already exceeds the memory limit.
What’s confusing me is that the docs say that on the paid plan, a workflow instance can persist up to 1GB of state.
So I’m trying to understand:
1. Is the 1GB limit only for state persisted between steps?
2. Or does it mean a workflow can actually work with large amounts of data in memory, as long as that data is never persisted?
3. Are there any recommended patterns for processing very large files in workflows without blowing up memory? (There’s a sketch of what I’m considering at the end of this post.)
An example of what I’m doing inside a step:
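(Simplified sketch: `MY_BUCKET`, `fileKey`, and the processing comment are placeholders standing in for my real bindings and logic.)

```ts
import { WorkflowEntrypoint, WorkflowStep, WorkflowEvent } from "cloudflare:workers";

// Placeholders: MY_BUCKET and fileKey stand in for my real R2 binding and input.
type Env = { MY_BUCKET: R2Bucket };
type Params = { fileKey: string };

export class FileWorkflow extends WorkflowEntrypoint<Env, Params> {
  async run(event: WorkflowEvent<Params>, step: WorkflowStep) {
    await step.do("process file", async () => {
      const object = await this.env.MY_BUCKET.get(event.payload.fileKey);
      if (!object) throw new Error("object not found");

      // The whole file ends up in a single in-memory buffer.
      // This is what crashes once the file outgrows the Worker memory limit.
      const buffer = await object.arrayBuffer();

      // ...actual processing of `buffer` happens here...
      return buffer.byteLength; // the step's return value is what gets persisted
    });
  }
}
```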
The doc I’m referring to: https://developers.cloudflare.com/workflows/reference/limits/
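To make question 3 concrete, this is the kind of chunked approach I’ve been considering instead: range-read the object from R2 one slice at a time, one step per chunk, so only a small offset checkpoint is returned and persisted between steps. Just a sketch, not sure it’s the intended pattern (the binding and names are placeholders again), and I don’t know how it interacts with the per-instance step limit for files this large:

```ts
import { WorkflowEntrypoint, WorkflowStep, WorkflowEvent } from "cloudflare:workers";

type Env = { MY_BUCKET: R2Bucket };  // placeholder binding
type Params = { fileKey: string };   // placeholder param

const CHUNK_SIZE = 64 * 1024 * 1024; // 64 MiB per range read (arbitrary)

export class ChunkedFileWorkflow extends WorkflowEntrypoint<Env, Params> {
  async run(event: WorkflowEvent<Params>, step: WorkflowStep) {
    // Persist only the object's size, not the object itself.
    const size = await step.do("stat file", async () => {
      const head = await this.env.MY_BUCKET.head(event.payload.fileKey);
      if (!head) throw new Error("object not found");
      return head.size;
    });

    // One step per chunk: range-read a slice from R2, process it,
    // and return only the next offset, so persisted state stays tiny.
    let offset = 0;
    while (offset < size) {
      offset = await step.do(`process chunk @ ${offset}`, async () => {
        const length = Math.min(CHUNK_SIZE, size - offset);
        const part = await this.env.MY_BUCKET.get(event.payload.fileKey, {
          range: { offset, length },
        });
        if (!part) throw new Error("range read failed");

        const bytes = await part.arrayBuffer(); // only one chunk in memory at a time
        // ...process `bytes` here...

        return offset + length;
      });
    }
  }
}
```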