Cloudflare Developers

Welcome to the official Cloudflare Developers server. Here you can ask for help and stay updated with the latest news.

How to connect your Cloudflare R2 Storage to CyberDuck

For an easier way to play around with your R2 bucket! If you haven't already, download your flavour (version) from the link below ⬇️ https://cyberduck.io/download...

Any of the SDKs compatible with AWS S3

Any of the SDKs compatible with AWS S3 will work — you only need to change the endpoint URL to your R2 endpoint.
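For instance, with boto3 the only R2-specific pieces are the account-scoped endpoint URL and the `auto` region; everything else is stock S3 usage. A minimal sketch (the account ID and credentials are placeholders, and the boto3 lines are commented out so the snippet stays dependency-free):

```python
# Sketch: the account-scoped endpoint is the only thing an S3 SDK needs
# changed to talk to R2. "your-account-id" is a placeholder.
def r2_endpoint(account_id: str) -> str:
    return f"https://{account_id}.r2.cloudflarestorage.com"

# With boto3, for example (uncomment if boto3 is installed):
# import boto3
# s3 = boto3.client(
#     "s3",
#     endpoint_url=r2_endpoint("your-account-id"),
#     aws_access_key_id="<r2-access-key-id>",
#     aws_secret_access_key="<r2-secret-access-key>",
#     region_name="auto",  # R2 always uses the pseudo-region "auto"
# )

print(r2_endpoint("your-account-id"))
```

After that, the usual SDK calls (`list_buckets`, `get_object`, `put_object`, presigning) work unchanged.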

speed deviation with custom domain

Hello, I'm trying to send objects (<= 100 MB) over a custom domain. The download speed varies severely on the same device and colo (40 MB/s down to 6 MB/s, HKG). I checked with several users and found that some were very fast and others very slow, even across different colos (mostly HKG, KIX)...

I rolled out presigned URL support to my

I rolled out presigned URL support to my users (users being site owners that use my Cloudflare integration system, not end users) a few days ago, and I'm already getting questions from site owners asking why presigned URLs only work over HTTP/1.1. I know the S3 API is limited to HTTP/1.1, but presigned URLs would see more noticeable benefits if they could use HTTP/2 or HTTP/3, since end users end up making multiple HTTP requests concurrently. Hopefully it's on the roadmap. It does seem silly when a site is running over HTTP/3 but some attachments coming via presigned URLs are HTTP/1.1...

Hi! I'm trying to integrate R2 into my

Hi! I'm trying to integrate R2 into my elixir app. I found this thread https://discord.com/channels/595317990191398933/1114222075268321322/1114246362284970036 but the person there says that it just worked. I keep getting 400 for streaming a file. Is that not supported yet?

also does cloudflare r2 store my data in

Also, does Cloudflare R2 store my data in multiple regions? (I don't mean caching; I mean a backup.)

I already try then cursor freezes and no

I already tried; then the cursor freezes and there's no progress to show. In Cloudflare, the same storage usage is shown.

Hello! Unless I'm missing something

Hello! Unless I'm missing something obvious, I'm pretty sure I'm running into an R2 bug. I'm using a token with "Object Read & Write" permissions and get AccessDenied when using rclone to write an object to the bucket. I know I have the access key ID & secret key correct, because as soon as I switch the token to "Admin Read & Write" permissions it starts working and rclone correctly writes new objects to the bucket. When I switch it back, it stops working again. What I'm doing is very simple:
```
sergey@ark ~> echo Hello World > myfile.txt...
```

Concurrency limits

Ya... the setup is 2 buckets. One is for the public stuff (and it is indeed accessed directly on a public domain and optimized appropriately with Cache Rules and Tiered Cache); that's for things intended to be public, like user avatars. The private bucket that is accessed via the API is the one he's having a problem with. The objects there go through the API because the ability to view/download them is user-permission based: the application checks the appropriate permissions for the logged-in user and then passes the object through as necessary. He isn't doing a crazy amount of traffic or anything... he's at ~19M Class B operations this month across all his buckets. The issue here is with the private bucket: since access is permission based, the object is passed through via the API. It's nowhere remotely close to 1,000 concurrent read operations even across all objects/users; I'd guess it peaks around 10ish...
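The access pattern described here (public bucket served directly, private bucket proxied through the application after a permission check) can be sketched roughly as follows. `PERMISSIONS`, `PRIVATE_BUCKET`, and `fetch_object` are hypothetical stand-ins for the real auth store and the R2 GetObject call:

```python
# Hypothetical stand-ins: a user -> allowed-keys map, and a dict playing
# the role of the private R2 bucket.
PERMISSIONS = {"alice": {"reports/q1.pdf"}}
PRIVATE_BUCKET = {"reports/q1.pdf": b"%PDF-1.7 ..."}

def fetch_object(key: str) -> bytes:
    """Stand-in for the S3/R2 GetObject call against the private bucket."""
    return PRIVATE_BUCKET[key]

def serve_object(user: str, key: str):
    """Return (status, body); deny unless the user may read this key."""
    if key not in PERMISSIONS.get(user, set()):
        return 403, b"Forbidden"
    return 200, fetch_object(key)
```

Each end-user download of a private object then costs one API read, which is why the concurrency seen by R2 tracks logged-in user activity rather than raw page traffic.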

Presigned URLs

I need some help -- I'm trying to generate a presigned upload URL with R2 and I keep getting this error: "The request signature we calculated does not match the signature you provided. Check your secret access key and signing method." My Go code is as follows:...
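A common cause of this error is a mismatch somewhere in the SigV4 canonical request: for R2 the region must be `auto`, presigned URLs use the literal payload `UNSIGNED-PAYLOAD`, and the query string must be sorted. As a reference point to diff against, here is a minimal presigner using only the Python standard library (account ID, bucket, key, and credentials are placeholders):

```python
import hashlib
import hmac
from urllib.parse import quote

def _hmac(key: bytes, msg: str) -> bytes:
    return hmac.new(key, msg.encode(), hashlib.sha256).digest()

def presign(account_id, access_key, secret_key, bucket, key,
            method="PUT", amz_date="20240101T000000Z", expires=3600):
    """Build a SigV4 presigned URL for R2 (region is always 'auto')."""
    host = f"{account_id}.r2.cloudflarestorage.com"
    datestamp = amz_date[:8]
    scope = f"{datestamp}/auto/s3/aws4_request"
    path = f"/{bucket}/{quote(key)}"

    params = {
        "X-Amz-Algorithm": "AWS4-HMAC-SHA256",
        "X-Amz-Credential": f"{access_key}/{scope}",
        "X-Amz-Date": amz_date,
        "X-Amz-Expires": str(expires),
        "X-Amz-SignedHeaders": "host",
    }
    query = "&".join(f"{quote(k, safe='')}={quote(v, safe='')}"
                     for k, v in sorted(params.items()))

    # Canonical request: presigned URLs always hash "UNSIGNED-PAYLOAD".
    canonical = (f"{method}\n{path}\n{query}\n"
                 f"host:{host}\n\nhost\nUNSIGNED-PAYLOAD")
    to_sign = ("AWS4-HMAC-SHA256\n" + amz_date + "\n" + scope + "\n"
               + hashlib.sha256(canonical.encode()).hexdigest())

    # Derive the signing key: date -> region -> service -> request.
    k = _hmac(("AWS4" + secret_key).encode(), datestamp)
    for part in ("auto", "s3", "aws4_request"):
        k = _hmac(k, part)
    signature = hmac.new(k, to_sign.encode(), hashlib.sha256).hexdigest()

    return f"https://{host}{path}?{query}&X-Amz-Signature={signature}"
```

Comparing this output field by field (credential scope, date, signed headers) with what your SDK generates usually pinpoints the mismatch.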

CORS

I've been getting a 403 ([Error] Origin http://localhost:3000 is not allowed by Access-Control-Allow-Origin. Status code: 403) when trying to PUT to a signed URL generated from the S3 API. I'm building it this way:...
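For reference, a browser PUT to a presigned URL only succeeds once the bucket's CORS policy allows both the calling origin and the PUT method. A minimal R2 CORS rule for local development might look like this (the origin and max age are illustrative):

```json
[
  {
    "AllowedOrigins": ["http://localhost:3000"],
    "AllowedMethods": ["GET", "PUT"],
    "AllowedHeaders": ["*"],
    "MaxAgeSeconds": 3600
  }
]
```

Note that browsers cache the preflight response, so retest with a hard refresh after changing the policy.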

S3 API to get R2 file content

I would like to use the S3 PHP API to get the results from Cloudflare Logpush. I can successfully list the objects with $client->listObjects(...). However, if I try to get the content of an object with $client->getObject([ 'Bucket' => self::BUCKET_NAME, 'Key' => $object['Key'] ]);...

R2 - S3

Hi everybody! Could somebody send me a simple curl example to list R2 buckets through the S3 API? I've never used S3 and I have no idea how to do it.
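Plain curl can't call the S3 API without SigV4 request signing, so the usual answers are either a reasonably recent curl build with the built-in `--aws-sigv4` option, or a small script that signs the request for you. As a sketch (account ID and credentials are placeholders), the following Python builds a signed `ListBuckets` request and prints the equivalent curl command:

```python
import hashlib
import hmac

def _hmac(key: bytes, msg: str) -> bytes:
    return hmac.new(key, msg.encode(), hashlib.sha256).digest()

def curl_list_buckets(account_id, access_key, secret_key,
                      amz_date="20240101T000000Z"):
    """Return a curl command for a SigV4-signed GET / (ListBuckets) on R2."""
    host = f"{account_id}.r2.cloudflarestorage.com"
    datestamp = amz_date[:8]
    scope = f"{datestamp}/auto/s3/aws4_request"
    payload_hash = hashlib.sha256(b"").hexdigest()  # empty request body

    headers = (f"host:{host}\n"
               f"x-amz-content-sha256:{payload_hash}\n"
               f"x-amz-date:{amz_date}\n")
    signed = "host;x-amz-content-sha256;x-amz-date"
    canonical = f"GET\n/\n\n{headers}\n{signed}\n{payload_hash}"
    to_sign = ("AWS4-HMAC-SHA256\n" + amz_date + "\n" + scope + "\n"
               + hashlib.sha256(canonical.encode()).hexdigest())

    # Derive the signing key: date -> region -> service -> request.
    k = _hmac(("AWS4" + secret_key).encode(), datestamp)
    for part in ("auto", "s3", "aws4_request"):
        k = _hmac(k, part)
    signature = hmac.new(k, to_sign.encode(), hashlib.sha256).hexdigest()

    auth = (f"AWS4-HMAC-SHA256 Credential={access_key}/{scope}, "
            f"SignedHeaders={signed}, Signature={signature}")
    return (f"curl https://{host}/ "
            f"-H 'x-amz-date: {amz_date}' "
            f"-H 'x-amz-content-sha256: {payload_hash}' "
            f"-H 'Authorization: {auth}'")
```

If your curl supports `--aws-sigv4` (added in curl 7.75), you can skip the manual signing and let curl do it with your R2 access key ID and secret, using region `auto` and service `s3`.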

Hello anyone know how to remove domain

Hello, does anyone know how to remove a domain from R2? I need to use it on another bucket... but I can't remove it from the existing one. The domain was added by R2. Even when I try to remove it from DNS, it just sends me back here...

CORS issue with fonts

Yes, I am using the PUT signed URLs the same way; I was just curious whether it's safe to allow all origins or not. It's a widget that people can use on their website, which is why I had to allow all origins to access the fonts. But then why were images and videos working from the public domain? I thought everything was accessible using that custom domain...

and its still running out of cpu time

and it's still running out of CPU time

Bucket-scoped tokens

I've got great news, everyone: you're now able to create API tokens scoped to specific (or all) buckets! All existing tokens will continue to work and will have access to all buckets. You can edit the permissions of these tokens, or create new ones to limit them to specific buckets. If you find issues with the authorization itself or the UI, please shout here! If you have other thoughts about the feature, please feel free to leave them in this thread. ...