terraform {
  backend "s3" {
    bucket                      = "bucket"
    key                         = "terraform.tfstate"
    endpoint                    = "https://xxx.r2.cloudflarestorage.com"
    skip_credentials_validation = true
    skip_region_validation      = true
    region                      = "us-east-1"
    access_key                  = "xxx"
    secret_key                  = "xxx"
  }
}

These features are now available!

Future roadmap:
- Public buckets, with custom domains: https://developers.cloudflare.com/r2/data-access/public-buckets/#enable-public-access-for-your-bucket
- Presigned URLs in the S3-compatible API (read more: https://blog.cloudflare.com/r2-ga/)
- Object Lifecycles
- Jurisdictional Restrictions (e.g. 'EU')
- Live Migration without Downtime (S3->R2)
Notes from the discussion on presigned URLs:
- rclone link will make a GetObject URL; aws4fetch can also generate one.
- rclone is massively overkill for this - just do curl/wget with a PUT method to the presigned URL.
- rclone is great, but you're mostly using it because it does SigV4 signing and whatnot for you; a presigned URL is already signed and only works for a single object anyway.
- rclone or other SDKs would do - they just abstract the signing away from you nicely.
- Example: curl -X PUT <url> --data-binary '123' (note uppercase -X sets the method; lowercase -x sets a proxy).
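The tools mentioned above (rclone link, aws4fetch, the SDKs) all boil down to SigV4 query-string signing, which needs nothing beyond a standard library. A minimal sketch of presigning a PUT against an R2 endpoint, using only the Python stdlib - the function name, endpoint, and credential values here are hypothetical placeholders, not an official API:

```python
import datetime
import hashlib
import hmac
import urllib.parse


def presign_put(endpoint, bucket, key, access_key, secret_key,
                region="auto", expires=3600):
    """Build a SigV4 query-string presigned PUT URL (no SDK required).

    Assumes `key` contains only URL-safe characters; a full implementation
    would percent-encode each path segment.
    """
    host = urllib.parse.urlparse(endpoint).netloc
    now = datetime.datetime.now(datetime.timezone.utc)
    amz_date = now.strftime("%Y%m%dT%H%M%SZ")
    datestamp = now.strftime("%Y%m%d")
    scope = f"{datestamp}/{region}/s3/aws4_request"

    # Query parameters that carry the signing metadata.
    query = {
        "X-Amz-Algorithm": "AWS4-HMAC-SHA256",
        "X-Amz-Credential": f"{access_key}/{scope}",
        "X-Amz-Date": amz_date,
        "X-Amz-Expires": str(expires),
        "X-Amz-SignedHeaders": "host",
    }
    # Canonical query string: sorted keys, everything percent-encoded.
    canonical_query = "&".join(
        f"{urllib.parse.quote(k, safe='')}={urllib.parse.quote(v, safe='')}"
        for k, v in sorted(query.items())
    )
    # Canonical request: method, path, query, headers, signed headers, payload.
    canonical_request = "\n".join([
        "PUT",
        f"/{bucket}/{key}",
        canonical_query,
        f"host:{host}\n",   # canonical headers block ends with a newline
        "host",
        "UNSIGNED-PAYLOAD",  # presigned URLs don't commit to a body hash
    ])
    string_to_sign = "\n".join([
        "AWS4-HMAC-SHA256",
        amz_date,
        scope,
        hashlib.sha256(canonical_request.encode()).hexdigest(),
    ])

    def _hmac(k, msg):
        return hmac.new(k, msg.encode(), hashlib.sha256).digest()

    # Derive the signing key: date -> region -> service -> "aws4_request".
    k = _hmac(("AWS4" + secret_key).encode(), datestamp)
    for part in (region, "s3", "aws4_request"):
        k = _hmac(k, part)
    signature = hmac.new(k, string_to_sign.encode(), hashlib.sha256).hexdigest()
    return f"{endpoint}/{bucket}/{key}?{canonical_query}&X-Amz-Signature={signature}"
```

Anyone holding the resulting URL can then upload with a plain curl -X PUT <url> --data-binary '123', no credentials needed - which is exactly the abstraction rclone, aws4fetch, and the SDKs provide.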