Cloudflare Developers

Welcome to the official Cloudflare Developers server. Here you can ask for help and stay updated with the latest news.

Hi there, I believe there's a bug in the

Hi there, I believe there's a bug in the AWS S3 multipart upload compatibility. Your documentation says "The last part has no minimum size", but from my testing this is not true: if the last part is less than 5 MB, R2 fails to complete the multipart upload with EntityTooSmall.
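For reference, a minimal repro sketch of what this report describes, using @aws-sdk/client-s3 against R2's S3-compatible endpoint. The account ID, bucket, key, and credentials below are placeholders, not from the original post:

```ts
import {
  S3Client,
  CreateMultipartUploadCommand,
  UploadPartCommand,
  CompleteMultipartUploadCommand,
} from "@aws-sdk/client-s3";

const s3 = new S3Client({
  region: "auto",
  endpoint: "https://<ACCOUNT_ID>.r2.cloudflarestorage.com",
  credentials: {
    accessKeyId: process.env.R2_ACCESS_KEY_ID!,
    secretAccessKey: process.env.R2_SECRET_ACCESS_KEY!,
  },
});

const Bucket = "my-bucket";           // placeholder
const Key = "multipart-test.bin";     // placeholder

// Two parts: a 5 MiB first part and a 1 MiB final part. Per the docs,
// only non-final parts should be subject to the 5 MiB minimum.
const partBodies = [
  new Uint8Array(5 * 1024 * 1024),
  new Uint8Array(1 * 1024 * 1024),
];

const { UploadId } = await s3.send(
  new CreateMultipartUploadCommand({ Bucket, Key })
);

const parts: { PartNumber: number; ETag?: string }[] = [];
for (const [i, Body] of partBodies.entries()) {
  const { ETag } = await s3.send(
    new UploadPartCommand({ Bucket, Key, UploadId, PartNumber: i + 1, Body })
  );
  parts.push({ PartNumber: i + 1, ETag });
}

// Per the report, this is where EntityTooSmall comes back when the
// final part is under 5 MB.
await s3.send(
  new CompleteMultipartUploadCommand({
    Bucket,
    Key,
    UploadId,
    MultipartUpload: { Parts: parts },
  })
);
```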

Dangling domain

How do I delete the R2 record? The bucket it targets has been deleted...

Vercel OG

Methods allowed: GET. Allowed domains: our domains & localhost ports. We added the Vercel OG playground domain as well.

HonoRequest - Hono

Hey everyone! I've got an annoying problem that I cannot solve: uploading multiple files over a form-data request to R2 with Hono.
```
const { assets, collection_name }: { assets: File[] | File; collection_name: string } =
  await context.req.parseBody({ all: true,...
```
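A minimal sketch of how such a handler could look, assuming a Workers R2 bucket binding named MY_BUCKET and a repeated form field named "assets" (both placeholders). With `{ all: true }`, Hono's `parseBody` returns an array for repeated fields but a lone `File` for a single one, so both cases need normalizing:

```ts
import { Hono } from "hono";

type Bindings = { MY_BUCKET: R2Bucket };
const app = new Hono<{ Bindings: Bindings }>();

app.post("/upload", async (c) => {
  const body = await c.req.parseBody({ all: true });

  // Repeated fields come back as an array; a single file as a lone File.
  const raw = body["assets"];
  const files = (Array.isArray(raw) ? raw : [raw]).filter(
    (v): v is File => v instanceof File
  );
  const collectionName = body["collection_name"] as string;

  for (const file of files) {
    await c.env.MY_BUCKET.put(`${collectionName}/${file.name}`, file, {
      httpMetadata: { contentType: file.type },
    });
  }
  return c.json({ uploaded: files.length });
});

export default app;
```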

Kinesis

Ah yes, that sounds promising. There's even an example that is pretty close: https://developers.cloudflare.com/queues/examples/send-errors-to-r2/
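In the spirit of that linked example, a hedged sketch of a queue consumer Worker that writes each batch of error messages into R2 (the ERROR_BUCKET binding name and key scheme are placeholders; see the docs page for the real example):

```ts
interface Env {
  ERROR_BUCKET: R2Bucket;
}

export default {
  async queue(batch: MessageBatch<unknown>, env: Env): Promise<void> {
    // One JSON object per batch, keyed by timestamp.
    const key = `errors/${new Date().toISOString()}.json`;
    const payload = JSON.stringify(batch.messages.map((m) => m.body));
    await env.ERROR_BUCKET.put(key, payload);
  },
} satisfies ExportedHandler<Env>;
```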

Tokens

Edit: finally figured it out (I swear I looked for a long time before posting). If you create a normal Cloudflare API token and pick Workers R2 as the scope, it won't show an Access Key ID, even if it's converted into an R2-specific token with per-bucket access controls. You have to start at the R2 token UI in order to create a token that will show an Access Key ID for use with the S3-compatible API.

Like, there is a bucket someone runs to

Like, there is a bucket someone runs to serve map tiles (Protomaps). While the service is already on R2, I don't want to hit them with the bill when I use a map, so I mirror the map files manually. Because they are super large (100 GB+), I can't easily do it in a Worker, so it would be cool to have some way of pointing R2 at a file and having it download automatically (without needing rclone), or even better, some way to script it.
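There's no built-in "pull this URL into R2" as far as I know, but as a sketch of scripting it yourself: a Node script that streams the source file into R2 via the S3 API using @aws-sdk/lib-storage, which handles multipart chunking so the 100 GB+ file never has to fit in memory. The source URL, bucket, key, and credentials are placeholders, and this assumes a recent SDK version that accepts web ReadableStream bodies:

```ts
import { S3Client } from "@aws-sdk/client-s3";
import { Upload } from "@aws-sdk/lib-storage";

const s3 = new S3Client({
  region: "auto",
  endpoint: "https://<ACCOUNT_ID>.r2.cloudflarestorage.com",
  credentials: {
    accessKeyId: process.env.R2_ACCESS_KEY_ID!,
    secretAccessKey: process.env.R2_SECRET_ACCESS_KEY!,
  },
});

const sourceUrl = "https://example.com/tiles.pmtiles"; // placeholder
const res = await fetch(sourceUrl);
if (!res.ok || !res.body) throw new Error(`fetch failed: ${res.status}`);

// Streams the response body to R2 in parts. 100 MiB parts keep a
// 100 GB file well under the multipart part-count limit.
await new Upload({
  client: s3,
  params: { Bucket: "tiles-mirror", Key: "tiles.pmtiles", Body: res.body },
  partSize: 100 * 1024 * 1024,
  queueSize: 4,
}).done();
```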

Is there any possibility that you could

Is there any possibility that you could use a Worker or some other sort of transform rule to either remove the spaces before storing, or rewrite the URL while in transit?
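As a sketch of the Worker variant, assuming an R2 bucket binding named ASSETS (a placeholder): the fetch handler normalizes the incoming path before the lookup, so stored keys never contain spaces:

```ts
interface Env {
  ASSETS: R2Bucket;
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const url = new URL(request.url);
    // decodeURIComponent turns %20 back into spaces; then strip them
    // so the lookup key matches the space-free stored key.
    const key = decodeURIComponent(url.pathname.slice(1)).replaceAll(" ", "");

    const object = await env.ASSETS.get(key);
    if (!object) return new Response("Not found", { status: 404 });

    const headers = new Headers();
    object.writeHttpMetadata(headers);
    headers.set("etag", object.httpEtag);
    return new Response(object.body, { headers });
  },
} satisfies ExportedHandler<Env>;
```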

is there any documents or tutorial to do

Are there any documents or tutorials on doing it correctly? I'm not good with DNS settings.

Red banners

So I created an R2 bucket with the "Specify jurisdiction" field selected, and it is now throwing this error every time I go into the bucket on the dash.

aws4fetch

It would be easiest to write it in JavaScript. As for how to access a file on GCS, you should be able to use a library like aws4fetch to pull the files down
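A sketch of that suggestion, assuming GCS HMAC keys (GCS's XML API accepts S3-style signatures when signed with HMAC credentials). The bucket, object path, and the "auto" region are placeholders/assumptions:

```ts
import { AwsClient } from "aws4fetch";

const gcs = new AwsClient({
  accessKeyId: process.env.GCS_HMAC_ACCESS_ID!,
  secretAccessKey: process.env.GCS_HMAC_SECRET!,
  service: "s3",
  region: "auto",
});

// aws4fetch signs the request like an S3 client would; GCS's
// storage.googleapis.com endpoint understands the signature.
const res = await gcs.fetch(
  "https://storage.googleapis.com/my-bucket/path/to/object"
);
if (!res.ok) throw new Error(`GCS fetch failed: ${res.status}`);
const data = await res.arrayBuffer(); // or stream res.body onward into R2
```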

is R2 able to add metadata for "put

Is R2 able to add metadata for "put object" commands?
```
import { PutObjectCommand,...
```
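Yes: with the S3 API, custom metadata goes in the Metadata field of PutObjectCommand, which is sent as x-amz-meta-* headers (and surfaced as customMetadata when read through a Workers binding). A sketch with placeholder bucket, key, and credentials:

```ts
import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";

const s3 = new S3Client({
  region: "auto",
  endpoint: "https://<ACCOUNT_ID>.r2.cloudflarestorage.com",
  credentials: {
    accessKeyId: process.env.R2_ACCESS_KEY_ID!,
    secretAccessKey: process.env.R2_SECRET_ACCESS_KEY!,
  },
});

await s3.send(
  new PutObjectCommand({
    Bucket: "my-bucket",
    Key: "report.pdf",
    Body: fileBytes, // placeholder for your file contents
    ContentType: "application/pdf",
    // Stored as x-amz-meta-uploaded-by / x-amz-meta-version headers.
    Metadata: { "uploaded-by": "alice", version: "3" },
  })
);
```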

Hey there, we're seeing custom domains

Hey there, we're seeing custom domains added to buckets stuck in SSL-pending state forever. Example: {"ssl":"pending","ownership":"pending"}. Is everything working OK now? Support ticket: 3014161...

R2 in LAX seems to be worthlessly slow

R2 in LAX seems to be worthlessly slow right now

Still getting 10001 error on the

Still getting 10001 error on the dashboard for newly created buckets.

Hi all, can anybody help me with adding

Hi all, can anybody help me with adding a custom domain to R2? It's stuck with the status "Unknown".

Seems like https://r2.dev has been

Seems like https://r2.dev has been blocked AGAIN by the Korean Government

I have cache headers from the request on

I have cache headers from the request on an mp4 file:
Cache-Control: max-age=2678400
Cf-Cache-Status: HIT
https://assets.flayks.com/4thsex-screencast.mp4...

Hi Team I need some help with building a

Hi Team, I need some help with building a custom HLS streaming service using R2. I'm aware of the streaming service provided by Cloudflare, but I want to build a custom one on R2. I need help implementing streaming of multiple HLS files through R2. Do I need to create a presigned URL for all the files, or is there a better way, considering my bucket is not public? Please help!
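Rather than presigning every playlist and segment, one common pattern is to front the private bucket with a Worker that serves objects through an R2 binding; the bucket stays private and access control can live in the Worker. A minimal sketch, assuming a binding named HLS_BUCKET and keys like "stream1/index.m3u8" / "stream1/segment0.ts" (all placeholders):

```ts
interface Env {
  HLS_BUCKET: R2Bucket;
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const key = new URL(request.url).pathname.slice(1);
    const object = await env.HLS_BUCKET.get(key);
    if (!object) return new Response("Not found", { status: 404 });

    const headers = new Headers();
    object.writeHttpMetadata(headers);
    // Serve the MIME types players expect for playlists and segments.
    if (key.endsWith(".m3u8")) {
      headers.set("content-type", "application/vnd.apple.mpegurl");
    } else if (key.endsWith(".ts")) {
      headers.set("content-type", "video/mp2t");
    }
    return new Response(object.body, { headers });
  },
} satisfies ExportedHandler<Env>;
```

Any viewer authentication (signed cookies, token query params, etc.) can then be checked once in the Worker before the lookup, instead of presigning each file.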

Dangling domain

Is there someone from CF I can give my account ID and bucket name to for a quick look? 🙏