I'm working on a project where I need to upload videos to Cloudflare using JavaScript. I found code that works fine for normal-sized videos, but I've run into an issue when trying to upload videos larger than 200 MB.
I've tried several solutions, but I haven't yet found an effective way to handle the uploading of large videos. Does anyone have an example or can you guide me on what I might be missing or how I can approach this issue?
Do you have any examples that could help? I would appreciate it. My problem is that it won't let me upload videos larger than 200 MB; if the video is smaller, it uploads without any trouble.
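If these videos are going to Cloudflare Stream, the 200 MB ceiling matches its limit for basic single-request uploads, and larger files are expected to go through the resumable tus protocol instead. A minimal browser-side sketch with tus-js-client, assuming the upload endpoint is something your backend hands out (the endpoint argument and chunk size here are placeholders, not anything from your code):

```js
// npm install tus-js-client
import * as tus from "tus-js-client";

// `file` is a File object from an <input type="file"> element.
function uploadLargeVideo(file, tusEndpoint) {
  const upload = new tus.Upload(file, {
    endpoint: tusEndpoint,            // tus upload URL provided by your backend
    chunkSize: 50 * 1024 * 1024,      // send the video in 50 MB chunks
    retryDelays: [0, 3000, 10000],    // retry transient network failures
    metadata: { name: file.name, filetype: file.type },
    onProgress: (sent, total) => console.log(`${((sent / total) * 100).toFixed(1)}%`),
    onError: (err) => console.error("Upload failed:", err),
    onSuccess: () => console.log("Upload finished:", upload.url),
  });
  upload.start();
}
```

If the target is R2 rather than Stream, the equivalent approach is an S3 multipart upload (there is a sketch of that further down).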
To upload files larger than 300 MB I apparently need to use Workers, but the videos I have seen on the topic are too complex to implement and integrate. Is there any way to increase the upload limit in R2 itself to around 5 GB?
When I looked at some YouTube videos and developer blogs, I found that it can be done with Workers and something called Wrangler. I tried the commands shown in the videos, but unfortunately they failed.
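For what it's worth, that limit is the dashboard's single-upload cap, not a cap on the bucket itself; larger objects usually go in either through the Wrangler CLI (`wrangler r2 object put <bucket>/<key> --file=<path>`) or, for multi-gigabyte files, through a multipart upload against the S3-compatible API. A rough Node.js sketch of the multipart route using the AWS SDK, where the account ID, credentials, bucket, key, and file path are all placeholders:

```js
// npm install @aws-sdk/client-s3 @aws-sdk/lib-storage
import { createReadStream } from "node:fs";
import { S3Client } from "@aws-sdk/client-s3";
import { Upload } from "@aws-sdk/lib-storage";

const s3 = new S3Client({
  region: "auto",
  endpoint: "https://<ACCOUNT_ID>.r2.cloudflarestorage.com",
  credentials: {
    accessKeyId: process.env.R2_ACCESS_KEY_ID,
    secretAccessKey: process.env.R2_SECRET_ACCESS_KEY,
  },
});

// Upload splits the stream into parts and completes the multipart upload,
// so files far beyond the dashboard's single-upload limit go through.
const upload = new Upload({
  client: s3,
  params: {
    Bucket: "my-bucket",
    Key: "videos/large-video.mp4",
    Body: createReadStream("./large-video.mp4"),
  },
  partSize: 100 * 1024 * 1024, // 100 MB parts
  queueSize: 4,                // up to 4 parts in flight at once
});

upload.on("httpUploadProgress", (p) => console.log(`${p.loaded} / ${p.total} bytes`));
await upload.done();
```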
Hello there, I'm new to R2. I want to upload static files like .webp or .png and access them. I have already uploaded the files and connected a custom domain to my bucket, and its status is active. But when I try to access the files, I can't. Also, Public URL Access is 'Not allowed'. Not sure what I'm missing.
Hi, I am trying to configure R2 with this Joomla plugin, but I get the following error: "Akeeba:internalGetBucket(): [500] Unauthorized:SigV2 authorization is not supported. Please use SigV4 instead."
This plugin works perfectly with AWS S3 and with DigitalOcean Spaces (which is also an S3-compatible service).
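The error itself is just R2 rejecting legacy v2 signatures, since R2 only accepts SigV4-signed requests; the fix is on the plugin side (if its S3 connection settings expose a signature version option, it needs to be set to v4). As a sanity check that the bucket and access keys are fine, here is a small Node.js sketch using the AWS SDK, which always signs with SigV4; the account ID, credentials, and bucket name are placeholders:

```js
// npm install @aws-sdk/client-s3
import { S3Client, ListObjectsV2Command } from "@aws-sdk/client-s3";

// The v3 AWS SDK signs every request with SigV4, so if this listing works,
// the R2 side is fine and the problem is in the plugin configuration.
const s3 = new S3Client({
  region: "auto",
  endpoint: "https://<ACCOUNT_ID>.r2.cloudflarestorage.com",
  credentials: {
    accessKeyId: process.env.R2_ACCESS_KEY_ID,
    secretAccessKey: process.env.R2_SECRET_ACCESS_KEY,
  },
});

const { Contents } = await s3.send(
  new ListObjectsV2Command({ Bucket: "my-backups", MaxKeys: 10 })
);
console.log((Contents ?? []).map((o) => o.Key));
```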
Hi! I have an R2 bucket which I created with public access and a public r2.dev subdomain. I uploaded some images and used the dev URL to fetch the images on the website. It was working fine until last month. Recently it started behaving weirdly: most of the time the image links don't work, but occasionally they suddenly do. Does anyone else have this issue?
So apparently if someone knows / guesses the name of your S3 bucket - even if it's private (!) - they can just bankrupt you by sending infinite PUT requests and there is nothing you can do about it.
The requests get rejected, but AWS still counts them as write operations against…
Hi guys, I need your help. I uploaded my files to R2 and connected my domain to make the bucket public, but in the generated URLs the folders and file names are being separated by '%2F' instead of '/'.
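That usually means the slash inside the object key is being percent-encoded when the URL is built, typically because the whole key (folders included) is passed through `encodeURIComponent`. A small sketch of the difference, with a placeholder domain and key:

```js
const key = "images/avatars/me.png";

// Encoding the whole key also encodes the separators: images%2Favatars%2Fme.png
const broken = encodeURIComponent(key);

// Encode each segment instead, so "/" keeps acting as a path separator.
const encodedKey = key.split("/").map(encodeURIComponent).join("/");

const url = `https://cdn.example.com/${encodedKey}`; // custom domain is a placeholder
console.log(url); // https://cdn.example.com/images/avatars/me.png
```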
I'm not able to consume the R2 API in Postman. Does anyone know if there's some little secret? Every time it gives me a different 403 asking for something different.
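The S3-compatible R2 endpoint expects every request to carry an AWS Signature v4, so in Postman the usual setup is the 'AWS Signature' authorization type with your R2 access key and secret, service name `s3`, region `auto`, and the base URL `https://<ACCOUNT_ID>.r2.cloudflarestorage.com`. An alternative that sidesteps Postman's auth entirely is to generate a presigned URL and paste it in; a Node.js sketch where the account ID, credentials, bucket, and key are placeholders:

```js
// npm install @aws-sdk/client-s3 @aws-sdk/s3-request-presigner
import { S3Client, GetObjectCommand } from "@aws-sdk/client-s3";
import { getSignedUrl } from "@aws-sdk/s3-request-presigner";

const s3 = new S3Client({
  region: "auto",
  endpoint: "https://<ACCOUNT_ID>.r2.cloudflarestorage.com",
  credentials: {
    accessKeyId: process.env.R2_ACCESS_KEY_ID,
    secretAccessKey: process.env.R2_SECRET_ACCESS_KEY,
  },
});

// The SigV4 signature ends up in the URL's query string, so the link can be
// pasted straight into Postman (or a browser) with no extra auth configured.
const url = await getSignedUrl(
  s3,
  new GetObjectCommand({ Bucket: "my-bucket", Key: "test.txt" }),
  { expiresIn: 3600 } // valid for one hour
);
console.log(url);
```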