No, again for security: because pages.dev uses subdomains and is on the Public Suffix List, browsers treat each subdomain as its own website, separated in terms of cookies, CORS, etc.
If subpaths were used instead, browsers would assume every Pages site is part of the same security context, letting them get around some CORS limits, access each other's cookies, etc.
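Roughly what the browser is doing under the hood (a simplified sketch, not Cloudflare's or any browser's actual code; the real Public Suffix List has thousands of entries and extra matching rules, and the hostnames here are just examples):

```ts
// Tiny stand-in for the Public Suffix List; only the entries needed for the example.
const PUBLIC_SUFFIXES = new Set(["com", "dev", "pages.dev"]);

// Return the registrable domain (public suffix + one more label). This is the
// boundary browsers use for cookie Domain attributes and same-site checks.
function registrableDomain(hostname: string): string {
  const labels = hostname.split(".");
  for (let i = 1; i < labels.length; i++) {
    const suffix = labels.slice(i).join(".");
    if (PUBLIC_SUFFIXES.has(suffix)) {
      return labels.slice(i - 1).join(".");
    }
  }
  return hostname;
}

// Because "pages.dev" is itself a public suffix, each project is its own
// registrable domain, so cookies and same-site checks cannot cross projects:
console.log(registrableDomain("my-app.pages.dev"));    // "my-app.pages.dev"
console.log(registrableDomain("other-app.pages.dev")); // "other-app.pages.dev"

// Ordinary subdomains of one registrable domain share it, so a cookie with
// Domain=example.com would be visible to both of these:
console.log(registrableDomain("a.example.com")); // "example.com"
console.log(registrableDomain("b.example.com")); // "example.com"
```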
This is all true, but it's also true that these bots are a problem. Perhaps on this domain Cloudflare could find a way to be more proactive against the bots and honeypot them to get them onto a block list immediately. Or maybe they already are, and this is just one that got through.
If some IP keeps making requests for nonexistent files across multiple subdomains... say 20 times, or some threshold... isn't that a signal that could be automated to flag it as a bad actor?
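Something like this minimal sketch of the heuristic (the thresholds, the recordMiss name, and the IP are invented for illustration, not anything Cloudflare actually does):

```ts
// Count 404s per client IP and flag the IP once its misses are spread across
// "enough" distinct subdomains, which looks like scanning rather than stale links.
interface MissRecord {
  total: number;
  subdomains: Set<string>;
}

const MISS_THRESHOLD = 20;      // total 404s before we care
const SUBDOMAIN_THRESHOLD = 5;  // ...spread over at least this many projects

const missesByIp = new Map<string, MissRecord>();

// Call this whenever a request resolves to a 404. Returns true when the IP
// crosses both thresholds.
function recordMiss(ip: string, subdomain: string): boolean {
  const record = missesByIp.get(ip) ?? { total: 0, subdomains: new Set<string>() };
  record.total += 1;
  record.subdomains.add(subdomain);
  missesByIp.set(ip, record);
  return record.total >= MISS_THRESHOLD && record.subdomains.size >= SUBDOMAIN_THRESHOLD;
}

// Example: a scanner probing for the same missing file across many projects trips the flag.
for (let i = 0; i < 25; i++) {
  if (recordMiss("198.51.100.7", `project-${i}.pages.dev`)) {
    console.log("flag 198.51.100.7 as a likely scanner");
    break;
  }
}
```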
How do you propose tracking that, though? Imagine 32 users sharing an IP (as on Starlink's CGNAT), all of whom hit 404s. If you block that IP, you might be blocking 1 bad user but also 31 good users who just happened to hit a 404.
Because Cloudflare has so much traffic going through it, at some point it's the only answer to crushing the bot dilemma that soaks up so much compute and bandwidth on the internet.
Bandwidth is pretty cheap for CF in most cases given their vast network, and running anti-bot checks across thousands of machines costs more compute than just serving static assets, which can be cached everywhere.
Sure, but at that point it's very user-specific, and Cloudflare wouldn't take action to block things on your own domain when it might be blocking legitimate users.
I mean, I do this for my domains already; I'm suggesting something just for pages.dev. But my secondary point is the concept of Cloudflare being the hub for lowering bot traffic across the internet in some way. It would be a form of community service, in line with how it pushed back against patent trolls.