Cloudflare Developers · 7d ago · 1 reply

Memory limits

Hey, quick question about the “Maximum state that can be persisted per Workflow instance” limit.

In my case, I’m dealing with files that can grow well beyond 128MB during a workflow run. Because of that, I end up holding a lot of data in memory, and the workflow crashes before the step even finishes, since the in-memory buffer alone already exceeds the Worker memory limit.

What’s confusing me is that the docs say that on the paid plan, a workflow instance can persist up to 1GB of state.

So I’m trying to understand:

Is that 1GB limit only for state persisted between steps, i.e. what a step returns? (A small sketch of what I mean follows these questions.)
Or does it mean a workflow can actually work with large amounts of data in memory as long as it’s not persisted?
Are there any recommended patterns for processing very large files in workflows without blowing up memory?
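
To illustrate what I mean by persisted state: my reading is that only what a step returns gets serialized and counted toward that limit, not whatever lives in memory while the callback runs. A rough sketch, where loadHugeBuffer() and TEMP_BUCKET are made-up placeholders rather than my real code:

const key = await step.do("illustration", async () => {
  // loadHugeBuffer() is a hypothetical helper; the buffer exists only in memory.
  const buffer = await loadHugeBuffer();
  await this.env.TEMP_BUCKET.put("temp/huge-file", buffer);
  // My understanding: only this small string is what gets persisted as state.
  return "temp/huge-file";
});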

An example of what I’m doing inside a step:
const tempFileR2Path = await step.do("fetch-file-and-store-temp", MODERATE_RETRY_OPTIONS, async () => {
  logger.info(`Fetching file for processing`, { fileId });
  // This buffers the entire file in Worker memory; this is the allocation that blows up.
  const buffer = await this.context.fileProcessingStorage.fetchOrThrow(fileId, filename);

  logger.info(`File fetched successfully, storing in temp location`, { fileId });
  await this.context.pageContentStorage.storeTempOriginal(fileId, buffer, { contentType: mimeType });

  const tempPath = PageContentBucketService.buildTempOriginalPath(fileId);
  logger.info(`File stored in temporary location`, { fileId, tempPath });

  // Flag the temp file for later cleanup; only the small path string is returned.
  this.context.cleanupStatusRecord.cleanupTempOriginalFile = true;
  return tempPath;
});
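
For reference, the direction I’ve been considering (not validated) is to stream the object between buckets instead of materializing a buffer, roughly like this, where SOURCE_BUCKET and TEMP_BUCKET are hypothetical R2 bindings standing in for my storage services:

const tempPath = await step.do("stream-file-to-temp", MODERATE_RETRY_OPTIONS, async () => {
  const object = await this.env.SOURCE_BUCKET.get(sourceKey);
  if (object === null) throw new Error(`File not found: ${sourceKey}`);

  const key = `temp/originals/${fileId}`;
  // R2 put() wants a stream of known length, so wrap the body in a
  // FixedLengthStream sized from the source object; the file never has
  // to fit in Worker memory this way.
  await this.env.TEMP_BUCKET.put(
    key,
    object.body.pipeThrough(new FixedLengthStream(object.size)),
    { httpMetadata: { contentType: mimeType } },
  );
  return key; // only this small string becomes persisted step state
});

Is streaming like this the recommended direction, or is there a better pattern for Workflows?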


Docs reference: https://developers.cloudflare.com/workflows/reference/limits/