Large File Conversion

Hi, I'm implementing a feature that requires file conversion, and the files can be > 30 MB, which a single Worker definitely can't handle. I haven't thought everything through, but in general I think the process can be divided into multiple steps (e.g. read, convert, store, ...). Is a Workflow a good solution for this kind of use case? If not, I'd appreciate suggestions. Thanks.
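To make the step split concrete, here's a rough sketch of what I'm imagining with the Workflows API. Every name in it (the class, `Params`, the keys) is a placeholder, and the heavy parts are stubbed out:

```ts
// Rough sketch only -- class, params, and keys are placeholders.
import { WorkflowEntrypoint, WorkflowStep, WorkflowEvent } from "cloudflare:workers";

type Env = Record<string, unknown>; // real bindings (R2, etc.) would go here
type Params = { sourceKey: string };

export class FileConversionWorkflow extends WorkflowEntrypoint<Env, Params> {
  async run(event: WorkflowEvent<Params>, step: WorkflowStep) {
    // Steps pass small keys between each other, not file bodies,
    // because step return values are persisted for replay.
    const sourceKey = await step.do("read", async () => {
      // stage/validate the uploaded file, return a reference to it
      return event.payload.sourceKey;
    });

    const convertedKey = await step.do("convert", async () => {
      // the actual conversion -- this is the part that worries me,
      // since a > 30 MB file may blow past Worker memory/CPU limits
      return `${sourceKey}.converted`;
    });

    await step.do("store", async () => {
      // publish/record the converted output under convertedKey
    });

    return convertedKey;
  }
}
```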
5 Replies
avenceslau · 5d ago
Workflows have the same limits as Workers for now. But you can leverage Containers and run any steps that would potentially exceed those limits inside a container. It's not ideal, but you get the best of both worlds IMO: the replayability and durability of Workflows and the flexibility of containers.
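Very roughly, the shape could be something like this. The `CONVERTER` binding and its `/convert` endpoint are made up for illustration:

```ts
// Very rough sketch: the Workflow keeps the orchestration (durable,
// replayable steps) and hands the heavy conversion to a container.
// The CONVERTER binding and its /convert endpoint are made up here.
import { WorkflowEntrypoint, WorkflowStep, WorkflowEvent } from "cloudflare:workers";

type Env = { CONVERTER: DurableObjectNamespace };
type Params = { key: string };

export class ContainerConvertWorkflow extends WorkflowEntrypoint<Env, Params> {
  async run(event: WorkflowEvent<Params>, step: WorkflowStep) {
    const outputKey = await step.do("convert in container", async () => {
      // Container instances are addressed like Durable Objects:
      // pick an instance by name, then talk to it over fetch().
      const id = this.env.CONVERTER.idFromName(event.payload.key);
      const container = this.env.CONVERTER.get(id);

      // The container does the memory-heavy work and writes its output
      // to storage itself; only a small key comes back, because step
      // results are persisted for replay.
      const res = await container.fetch("http://container/convert", {
        method: "POST",
        body: JSON.stringify({ key: event.payload.key }),
      });
      return await res.text();
    });

    return outputKey;
  }
}
```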
Yuan (OP) · 5d ago
Yeah, AWS Fargate was my initial thought, but since I'm using Cloudflare KV for storage, it would be a lot easier if I could do the processing in a Worker/DO/Workflow. Cloudflare doesn't provide anything similar to Fargate/ECS on AWS, does it?
avenceslau · 5d ago
We do provide something similar, although it's still in beta: https://developers.cloudflare.com/containers/
Cloudflare Docs — Containers (Beta): Run code written in any programming language, built for any runtime, as part of apps built on Workers.
.r20 · 5d ago
You might be able to get Durable Objects to work for this use case, along the lines of the sketch below.
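A loose sketch of the idea: a DO that converts the file one chunk at a time, persisting a cursor so work can resume after a restart. The `FILES` binding, chunk size, and method name are all invented, and it assumes the source sits in R2 rather than KV, since a > 30 MB file wouldn't fit in KV anyway (25 MiB value limit) and R2 supports range reads:

```ts
// Loose sketch: a Durable Object that converts a file one chunk at a
// time, persisting a cursor so work survives restarts/eviction.
// The FILES R2 binding, chunk size, and method name are all invented.
import { DurableObject } from "cloudflare:workers";

type Env = { FILES: R2Bucket };

export class ChunkedConverter extends DurableObject<Env> {
  async convert(sourceKey: string): Promise<number> {
    const CHUNK = 5 * 1024 * 1024; // 5 MiB per pass to stay within memory limits

    const head = await this.env.FILES.head(sourceKey);
    if (!head) throw new Error(`no such object: ${sourceKey}`);

    // Resume from the last persisted offset if we were interrupted.
    let offset = (await this.ctx.storage.get<number>("offset")) ?? 0;

    while (offset < head.size) {
      const length = Math.min(CHUNK, head.size - offset);
      const part = await this.env.FILES.get(sourceKey, {
        range: { offset, length },
      });
      if (!part) break;

      const bytes = await part.arrayBuffer();
      // ...convert this chunk and append the result somewhere...

      offset += bytes.byteLength;
      await this.ctx.storage.put("offset", offset); // durable progress cursor
    }

    return offset; // bytes processed
  }
}
```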
Yuan (OP) · 5d ago
Thanks for the replies. I've decided to use AWS S3 and ECS.
