Had no clue about this! Will give it a try for sure. I have a 1 Gbps connection both at home and on the server. I used rclone for the first time and it was quite fast for approx. ~2 million files and ~160 GB, versus aws-cli, which took me a full day for only ~50 GB and ~1 million files.
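In case it helps anyone comparing transfer speeds, here's a minimal sketch of the kind of rclone invocation that tends to make a large small-file sync fast. The remote name `r2`, the bucket name `my-bucket`, and the local path are placeholders, and the flag values are just a starting point, not tuned recommendations:

```sh
# Copy a local tree into an R2 bucket via an rclone S3 remote named "r2".
# Raising --transfers/--checkers parallelizes small-file uploads,
# --fast-list cuts down listing calls on big buckets, -P shows progress.
rclone copy /path/to/data r2:my-bucket \
  --transfers 32 \
  --checkers 64 \
  --fast-list \
  -P
```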
May I also ask, since I haven't tested much yet: is there any difference for an EU jurisdiction bucket, e.g. are files served from EU IP addresses? https://community.cloudflare.com/t/site-ip-address-changes-very-often/374169/4 reminds me of this, and I wonder if I'd run into issues with some ISPs blocking or limiting Cloudflare ...
May I use this opportunity to ask whether anything is different, or needed, in the rclone config file to get this to work? Or do I just define and use an R2 API token that has read-write access to both buckets?
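For what it's worth, here's a minimal sketch of what rclone remotes for R2 can look like. The remote names and placeholders are mine, and the EU-jurisdiction endpoint form in particular is an assumption to verify against the R2 docs, the idea being that a jurisdiction-restricted bucket is addressed through its own S3 endpoint rather than the default account endpoint:

```ini
# rclone.conf sketch (placeholders in <>): one remote per endpoint.
[r2]
type = s3
provider = Cloudflare
access_key_id = <access_key_id>
secret_access_key = <secret_access_key>
endpoint = https://<account_id>.r2.cloudflarestorage.com

# Assumed endpoint form for an EU jurisdiction bucket; check the R2 docs.
[r2eu]
type = s3
provider = Cloudflare
access_key_id = <access_key_id>
secret_access_key = <secret_access_key>
endpoint = https://<account_id>.eu.r2.cloudflarestorage.com
```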
I asked yesterday but didn't provide an example. I have this R2 custom domain set up automatically by "connecting" it. It returns a 301 for any URL, redirecting back to the same URL that was requested. Here's an example: https://sniff.lololo.lol/hello.txt (I have checked every bit of config for the zone and there are no Page Rules, redirects, etc.).
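For anyone trying to reproduce this, a quick diagnostic sketch using the URL above: dump only the response headers so you can see the 301 and its Location target without actually following it, then follow redirects to confirm it loops back to itself.

```sh
# Show the status line and Location header without following the redirect.
curl -sI https://sniff.lololo.lol/hello.txt | grep -i -E '^(HTTP|location)'

# Follow redirects (up to 5) to confirm the loop.
curl -sIL --max-redirs 5 https://sniff.lololo.lol/hello.txt
```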
May I ask whether we could expect, at some point in the future, to be able to filter buckets in the bucket list, e.g. by location / jurisdiction? It might be possible via the API, but not on the Dashboard yet. Thank you!
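Until there's a Dashboard filter, a rough workaround sketch is to list buckets from the CLI or API and filter client-side. Whether the listing output actually exposes a location/jurisdiction field is an assumption to verify; the account ID and token below are placeholders.

```sh
# List buckets with wrangler (needs to be logged in to the right account).
wrangler r2 bucket list

# Or via the Cloudflare API, filtering client-side with jq; whether the
# response includes a location/jurisdiction field is an assumption to check.
curl -s "https://api.cloudflare.com/client/v4/accounts/<ACCOUNT_ID>/r2/buckets" \
  -H "Authorization: Bearer <API_TOKEN>" | jq '.result'
```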
If we have e.g. a sub-domain on a bucket, for example: a) mydomain.hr is the main domain, b) then I've got the sub-domain sub.mydomain.hr, c) for r2.sub.mydomain.hr, do I need to purchase Advanced Certificate Manager to cover that R2 custom domain, which seems to me to be a deep-level sub-domain?
When I did the rclone transfer from DigitalOcean I was also seeing intermittent full hangs. Not sure if that's similar to what you mentioned; at the time I couldn't tell if it was them or R2.