Network Connection Lost on Fetch but Fetch not Attempted

Hello, I'm trying to send large content from a Worker and I'm getting a "network connection lost" error when attempting to send anything above roughly 100 MB. I don't see any requests arriving on the receiving end, so I suspect the Worker isn't even attempting to send the request at all. Is there some limit I'm hitting on outbound request size from a Worker? Here's some example code:

```js
export default {
  async fetch(request, env, ctx) {
    const object = await env.R2_BUCKET.get('bigfile.txt');
    const putResponse = await fetch('http://webhook.site/............', {
      method: 'PUT',
      body: object.body,
      //body: "Hello World",
      headers: {
        'Content-Length': object.size.toString(),
        'Content-Type': 'application/octet-stream',
      },
    });
    return new Response('', { status: 200 });
  },
};
```
3 Replies
Gravite2090•4w ago
I'd guess you're running out of memory after loading a 100 MB file. The workerd implementation of fetch is probably buffering the whole body before sending it, and dying before it ever establishes a connection to the remote server. Just a theory, but you can read here if you really want to figure this out: https://developers.cloudflare.com/workers/platform/limits/#memory You probably want to consider chunking, or a different architecture altogether. I'm not sure Workers are the best choice for moving files that large around over TCP/IP.
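For reference, here's a minimal sketch of the chunking idea, assuming the receiving endpoint can accept multiple ranged PUT requests. The URL, object key, chunk size, and Content-Range convention are all placeholders for illustration, not a drop-in fix:

```js
export default {
  async fetch(request, env, ctx) {
    const CHUNK_SIZE = 10 * 1024 * 1024; // 10 MB per outbound request (arbitrary)

    // head() gives us the total size without reading the body.
    const head = await env.R2_BUCKET.head('bigfile.txt');
    if (!head) return new Response('object not found', { status: 404 });

    for (let offset = 0; offset < head.size; offset += CHUNK_SIZE) {
      const length = Math.min(CHUNK_SIZE, head.size - offset);

      // Ranged read: only this slice of the object is pulled from R2.
      const part = await env.R2_BUCKET.get('bigfile.txt', {
        range: { offset, length },
      });

      await fetch('https://example.com/upload', { // placeholder endpoint
        method: 'PUT',
        body: part.body, // still a stream, but bounded to this range
        headers: {
          'Content-Range': `bytes ${offset}-${offset + length - 1}/${head.size}`,
          'Content-Type': 'application/octet-stream',
        },
      });
    }

    return new Response('done', { status: 200 });
  },
};
```

Each outbound request body is bounded to CHUNK_SIZE, so no single transfer has to hold (or stream) anywhere near the full 100 MB at once.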
SorenOP•4w ago
It confuses me that the exception is "network connection lost" and not something about running out of memory.
With ReadableStreams, fetch should be sending the body in chunks rather than reading the whole file into memory.
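One way to test that theory is to count bytes as they pass through the stream. A rough debugging sketch that would sit inside the Worker's fetch handler (the upload URL is a placeholder):

```js
// Pipe the R2 body through a counting TransformStream to see how many
// bytes actually leave the Worker before the error fires.
const object = await env.R2_BUCKET.get('bigfile.txt');

let sent = 0;
const counter = new TransformStream({
  transform(chunk, controller) {
    sent += chunk.byteLength; // R2 body chunks are Uint8Arrays
    controller.enqueue(chunk);
  },
});

try {
  await fetch('https://example.com/upload', { // placeholder endpoint
    method: 'PUT',
    body: object.body.pipeThrough(counter),
    headers: { 'Content-Type': 'application/octet-stream' },
  });
  console.log(`finished after ${sent} bytes`);
} catch (err) {
  console.log(`error after ${sent} bytes:`, err);
  throw err;
}
```

If the error shows up with `sent` still at 0, the body was likely buffered (or never read) rather than streamed; if it fails partway through, the connection really is dropping mid-transfer.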
Gravite2090•4w ago
Well, it might be a bad guess 😉 but if your code otherwise works fine with a 3 MB test file, it's probably worth checking.
