Workers Site caching and DDoS

I'm fairly new to CF. I have a Workers Site that renders static HTML for any subdomain using the route *.mydomain.com, and it works as expected. What I can't figure out is how to enforce caching on the generated HTML so the Worker isn't hit on every request until the cached HTML is purged explicitly or CF evicts the cached page. I'm mostly worried about getting DDoS-ed. I'm building a simple static site builder using a CF Workers Site as the origin and Cloudflare for SaaS for custom domains. I understand I can upgrade to Business/Enterprise for better DDoS options, but I'm wondering how small companies deal with an unexpectedly large Workers bill. Edit: small typo
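To make the goal concrete, here is a minimal sketch of the kind of caching I'm trying to get, using the Workers Cache API (renderHtml() is a hypothetical stand-in for my renderer, not my actual code):

```js
// Minimal sketch: cache the generated HTML in Cloudflare's edge cache so
// repeat requests skip the render. renderHtml() is a hypothetical stand-in
// for whatever builds the page. Assumes GET requests only.
addEventListener('fetch', (event) => {
  event.respondWith(handleRequest(event));
});

async function handleRequest(event) {
  const cache = caches.default;
  const cacheKey = new Request(event.request.url, event.request);

  // Serve straight from the edge cache when the page is already stored.
  const cached = await cache.match(cacheKey);
  if (cached) return cached;

  // Cache miss: render the page and store it with an explicit TTL.
  const html = await renderHtml(event.request); // hypothetical renderer
  const response = new Response(html, {
    headers: {
      'Content-Type': 'text/html; charset=utf-8',
      // Keep the page in the edge cache for a day unless purged earlier.
      'Cache-Control': 'public, max-age=86400',
    },
  });
  event.waitUntil(cache.put(cacheKey, response.clone()));
  return response;
}
```

Even with something like this, the Worker itself still runs on every request (it sits in front of the cache on its route), so it only saves the render work, not the invocation, which is why I'm asking about the DDoS/billing side.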
Chang · 4mo ago
I tried a Cache Everything page rule for *.mydomain.com, but it doesn't work as expected; my Worker is still being hit. Thanks!