Hello everyone,
I'm currently building a DJ record pool / music library platform where users can download music files and production-related content (e.g. project files, sample packs, stems).
Some of these files can be quite large (up to ~5 GB per file).
My goal is to host everything on Cloudflare R2 and allow users to simply click a download button in the frontend and receive the full file via a direct download.
At the moment, files are delivered through a Cloudflare Worker, which I suspect may be part of the problem.
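To make the current setup concrete, the download path is roughly the following (heavily simplified — `BUCKET` is a placeholder for the actual R2 binding name, and the routing is more involved in reality):

```javascript
// Simplified sketch of the current download Worker (binding name illustrative).
// In a real Worker this object would be the module's default export.
const worker = {
  async fetch(request, env) {
    // Object key is taken from the URL path, e.g. /packs/demo.zip -> packs/demo.zip
    const key = decodeURIComponent(new URL(request.url).pathname.slice(1));
    const object = await env.BUCKET.get(key);
    if (object === null) {
      return new Response("Not found", { status: 404 });
    }
    // Stream the R2 object body straight through to the client.
    return new Response(object.body, {
      headers: {
        "Content-Type":
          object.httpMetadata?.contentType ?? "application/octet-stream",
        "Content-Disposition": `attachment; filename="${key.split("/").pop()}"`,
      },
    });
  },
};
```

So every byte of every download currently passes through the Worker rather than coming straight from R2.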
However, my developer is concerned that this is not reliably achievable with Cloudflare, especially for large files.
He suggested that multipart uploads could be problematic in cases of unstable internet connections, potentially leading to failed or corrupted uploads.
From my understanding, multipart uploads with retries and resumability should actually improve reliability, not reduce it.
So my questions are:
1. What is the recommended architecture for handling large file uploads (~5 GB) with Cloudflare R2?
2. Is direct-to-R2 upload via presigned URLs + multipart upload the correct approach?
3. How can we ensure upload reliability (resume, retry, integrity checks)?
4. Is serving files through a Worker a bottleneck for large downloads, and should downloads instead be served directly from R2?
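To make question 2 concrete, the client-side flow I'm imagining is roughly the sketch below. The names and numbers are placeholders; the presigned part URLs would come from our backend via R2's S3-compatible CreateMultipartUpload / UploadPart API:

```javascript
// Sketch of the client-side multipart flow I have in mind (names illustrative).
// R2's S3-compatible API requires every part except the last to be the same
// size, with a 5 MiB minimum.
const PART_SIZE = 100 * 1024 * 1024; // 100 MiB per part

// Split a total byte size into [start, end) ranges, one per part.
function partRanges(totalSize, partSize = PART_SIZE) {
  const ranges = [];
  for (let start = 0; start < totalSize; start += partSize) {
    ranges.push([start, Math.min(start + partSize, totalSize)]);
  }
  return ranges;
}

// Retry an async part upload with exponential backoff, so a dropped
// connection only re-sends one part instead of restarting a 5 GB upload.
async function withRetry(fn, attempts = 3, baseDelayMs = 1000) {
  for (let attempt = 0; attempt < attempts; attempt++) {
    try {
      return await fn();
    } catch (err) {
      if (attempt === attempts - 1) throw err;
      await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** attempt));
    }
  }
}
```

Each part PUT returns an ETag, and sending the collected part numbers + ETags to CompleteMultipartUpload is what finalizes the object — which, as I understand it, is also where the integrity check happens. Am I thinking about this correctly?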
My developer is currently pushing for an AWS-based solution (S3), but I would strongly prefer to stay within the Cloudflare ecosystem if this can be implemented in a robust way.
Any guidance, best practices, or real-world experience would be highly appreciated.
Thanks in advance!