Cloudflare Developers


Take a look at the response body

Could I DM you with my current DNS settings?

Sometimes when doing a HEAD request, I am getting the error `"reason":"Bad Request","statusCode":400},"message":"UnknownError"`. This is happening a lot.

It seems my bucket has ground to a near halt. I was previously uploading at 130 MiB/s+ and now I'm seeing `Completed 312.0 MiB/4.7 GiB (4.1 MiB/s) with 1 file(s) remaining` from the AWS CLI. df681cd207a9e2afa394d260c486fd1e, bucket tts-data

This is happening with a lot of users over the past few days. Is there any way to get this unblocked, or anything we should update on our side?

Hi! I'm looking for some help with my R2 bucket. I'm using it via Pages and a binding. I see my put request was successful, but I don't see anything in the bucket in the dashboard. I also don't see anything when trying to list all objects via the AWS S3 SDK, but if I get the key in the worker it returns what looks to be the file I uploaded. Any ideas?...
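For reference, puts and gets through a Pages binding look roughly like the sketch below (the `MY_BUCKET` binding name and the object key are placeholders, not taken from the message):

```typescript
// Hypothetical Pages Function (e.g. functions/upload.ts) illustrating the scenario above.
interface Env {
  MY_BUCKET: R2Bucket;
}

export const onRequestPut: PagesFunction<Env> = async ({ request, env }) => {
  // Write the request body to R2 via the binding.
  const object = await env.MY_BUCKET.put("example.txt", request.body);
  return Response.json({ key: object?.key, etag: object?.etag });
};

export const onRequestGet: PagesFunction<Env> = async ({ env }) => {
  // Reading the same key back through the binding works in the report above,
  // even though the dashboard and S3 listing appear empty.
  const object = await env.MY_BUCKET.get("example.txt");
  if (!object) return new Response("Not found", { status: 404 });
  return new Response(object.body);
};
```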

Sorry, what's KV v2?

I've been trying to reach support for hours but I couldn't get through.

just change the method to `PUT` and it'll work. You'll likely want to add `If-Unmodified-Since` to ensure write-once only. The rest is optional, though I would really encourage all of the headers here for various reasons. 🙂

```typescript
const signed_url = await aws_client.sign(
  new Request(url, {...
```
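A fuller sketch of that pattern using aws4fetch's `AwsClient` (the account, bucket, key, expiry, and the epoch `If-Unmodified-Since` value are illustrative assumptions; the original snippet is truncated):

```typescript
import { AwsClient } from "aws4fetch";

// Placeholder credentials and names; substitute real values.
const aws_client = new AwsClient({
  accessKeyId: "<r2 access key id>",
  secretAccessKey: "<r2 secret access key>",
});

const url = new URL(
  "https://<account id>.r2.cloudflarestorage.com/my-bucket/my-object"
);
url.searchParams.set("X-Amz-Expires", "3600"); // URL stays valid for one hour

// Sign a PUT request, moving the signature into the query string so the result
// can be handed to a client as a presigned upload URL.
const signed = await aws_client.sign(
  new Request(url, {
    method: "PUT",
    headers: {
      // Write-once, as suggested above: the request is rejected if the object
      // was already written. The epoch date is an illustrative way to say "never overwrite".
      "If-Unmodified-Since": new Date(0).toUTCString(),
    },
  }),
  { aws: { signQuery: true } }
);

// The presigned PUT URL; the uploader must also send the signed headers.
console.log(signed.url);
```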

R2 presigned URLs

We are using a presigned URL to upload, with a timeout of 15 minutes; the files are around 8-10 GB.

It seems the best way is:
1. Use the "List images V2" endpoint to get all images (https://developers.cloudflare.com/api/operations/cloudflare-images-list-images-v2)
2. Use the "Base image" endpoint to fetch the images (make sure you throttle somewhat) and upload to R2 (https://developers.cloudflare.com/api/operations/cloudflare-images-base-image)
...
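A rough sketch of those two steps as a Worker-style script; `ACCOUNT_ID`, `API_TOKEN`, and the `IMAGES` R2 binding are placeholders, and the exact request/response shapes should be checked against the two linked docs pages:

```typescript
interface Env {
  IMAGES: R2Bucket;
  ACCOUNT_ID: string;
  API_TOKEN: string;
}

async function migrateImages(env: Env): Promise<void> {
  const api = `https://api.cloudflare.com/client/v4/accounts/${env.ACCOUNT_ID}/images`;
  const auth = { Authorization: `Bearer ${env.API_TOKEN}` };

  // 1. List images (v2). Real code should follow the continuation token to page
  //    through all results rather than stopping after the first page.
  const listResp = await fetch(`${api}/v2`, { headers: auth });
  if (!listResp.ok) throw new Error(`list failed: ${listResp.status}`);
  const list = (await listResp.json()) as { result: { images: { id: string }[] } };

  for (const image of list.result.images) {
    // 2. Fetch the original ("base image") bytes and copy them into R2.
    //    Throttle or batch this in real use, as the message suggests.
    const blob = await fetch(`${api}/v1/${image.id}/blob`, { headers: auth });
    if (!blob.ok) continue;
    await env.IMAGES.put(image.id, blob.body);
  }
}
```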

HeadBucket + bucket-scoped tokens

We just released a new version that should fix issues with bucket-scoped tokens and rclone. HeadBucket is now allowed on a bucket-scoped token, so you should not need the `no_check_bucket` option. Let me know if the problem persists! I see a couple of people ran into this yesterday (@sohum @FloppyDisk 💾)...

Sippy

Hi, can someone help here? I am trying to enable Sippy for one of my buckets, but the curl execution fails with the error `{"success":false,"errors":[{"code":10063,"message":"Invalid upstream credentials"}],"messages":[],"result":null}`. I have checked that the token is active and correct. Please help, as I am completely stuck...
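For reference, enabling Sippy over the API looks roughly like the sketch below. The endpoint path and payload shape are recalled from the Sippy docs and should be verified against the current API reference; the 10063 "Invalid upstream credentials" error suggests the source (upstream) provider's credentials rather than the Cloudflare token:

```typescript
// Hedged sketch of enabling Sippy via the REST API; all values are placeholders.
const ACCOUNT_ID = "<account id>";
const BUCKET = "<bucket name>";
const API_TOKEN = "<cloudflare api token>";

const resp = await fetch(
  `https://api.cloudflare.com/client/v4/accounts/${ACCOUNT_ID}/r2/buckets/${BUCKET}/sippy`,
  {
    method: "PUT",
    headers: {
      Authorization: `Bearer ${API_TOKEN}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      // "Upstream" credentials: the source provider the 10063 error is about.
      source: {
        provider: "aws",
        region: "<source region>",
        bucket: "<source bucket>",
        accessKeyId: "<source access key id>",
        secretAccessKey: "<source secret access key>",
      },
      destination: {
        provider: "r2",
        accessKeyId: "<r2 access key id>",
        secretAccessKey: "<r2 secret access key>",
      },
    }),
  }
);
console.log(await resp.json());
```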

Question about serving JS and CSS assets (and probably images) from R2. What is the current perspective and the most optimal way to serve public, immutable assets from R2 through a custom domain? I see old threads about tiered caching not working for a custom domain + public bucket. Would that mean I should implement a Worker on my custom domain to read from R2 and respond with cache-control headers to leverage tiered caching?
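A minimal sketch of that Worker approach, assuming an R2 binding named `ASSETS` on the custom-domain route; the key mapping and cache lifetime are illustrative:

```typescript
interface Env {
  ASSETS: R2Bucket;
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const key = new URL(request.url).pathname.slice(1); // e.g. "app.js", "styles/site.css"
    const object = await env.ASSETS.get(key);
    if (!object) return new Response("Not found", { status: 404 });

    const headers = new Headers();
    object.writeHttpMetadata(headers); // carry over content-type etc. stored on the object
    headers.set("ETag", object.httpEtag);
    // Immutable, fingerprinted assets: long max-age so Cloudflare's cache can hold them.
    headers.set("Cache-Control", "public, max-age=31536000, immutable");

    return new Response(object.body, { headers });
  },
} satisfies ExportedHandler<Env>;
```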

Posting here is preferred.

Thanks for the reply. We've been seeing slower downloads from India since yesterday. I just tested it again and the transfers are still slow.

Quick feedback on the S3 support of R2. The AWS S3 multipart spec says each part must be > 5 MB (except the last part), and parts can be of different sizes. But R2 insists on every part being exactly the same size (which is not so easy to implement, TBH).
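In the meantime, a minimal sketch of carving a payload into parts that satisfy that constraint (every part the same fixed size, only the final one smaller); the 10 MiB part size is an illustrative choice, not a documented limit:

```typescript
// Split a payload into equal-size parts for an R2 multipart upload. Note that, per
// the EntityTooSmall report further down, the final part may still need to be at
// least 5 MiB.
const PART_SIZE = 10 * 1024 * 1024; // 10 MiB

function splitIntoParts(data: Uint8Array): Uint8Array[] {
  const parts: Uint8Array[] = [];
  for (let offset = 0; offset < data.length; offset += PART_SIZE) {
    parts.push(data.subarray(offset, Math.min(offset + PART_SIZE, data.length)));
  }
  return parts;
}
```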

Is it just me, or are downloads flaky?

Hi there, I believe there's a bug in the AWS S3 multipart upload compatibility. Your documentation says "The last part has no minimum size", but from my testing this is not true: if the last part is less than 5 MB, R2 will fail to complete the multipart upload with EntityTooSmall.