Managing High Bandwidth Consumption in Web Hosting and Cloud Storage
When using a web hosting service like Netlify or Vercel, it's often advised to store files in a cloud storage solution such as Azure Blob Storage or Amazon S3 to absorb bandwidth surges and control costs during an attack or a spike in file requests. But the same concern applies to cloud-hosted files: attackers can target those files with repeated requests too. Are there effective solutions to this problem? Or are the associated cloud storage costs so minimal that this issue is rarely a concern?
4 Replies
the big thing is the file size
not the file itself
a JS bundle is a few KB and can easily be cached with response headers
large files like images and videos are way more tricky
they can easily rack up a big bill before being flagged as "bad accesses"
neither Vercel nor Netlify is a CDN in the way S3 + CloudFront is
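The caching point above can be sketched as a policy for `Cache-Control` response headers. This is a minimal, hypothetical helper (the extensions and lifetimes are illustrative, not a recommendation for any specific host):

```python
# Sketch: choose a Cache-Control header by asset type (hypothetical helper).
# Fingerprinted bundles (e.g. app.3f2a1c.js) never change at the same URL,
# so they can be cached aggressively; large media gets a shorter lifetime
# so a CDN can still absorb repeat requests without holding files forever.

def cache_headers(path: str) -> dict:
    if path.endswith((".js", ".css")):
        # Immutable, hashed bundles: cache for a year, never revalidate.
        return {"Cache-Control": "public, max-age=31536000, immutable"}
    if path.endswith((".jpg", ".png", ".mp4")):
        # Large media: let the CDN cache it for a day.
        return {"Cache-Control": "public, max-age=86400"}
    # Everything else: cache, but revalidate with the origin each time.
    return {"Cache-Control": "no-cache"}

print(cache_headers("app.3f2a1c.js")["Cache-Control"])
```

The point is that a cached small bundle is served from the CDN edge, so repeated requests stop hitting (and billing) your origin, which is exactly why large uncacheable media is the riskier case.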
yea, but someone could also repeatedly download my S3 files and rack up the bill that way. Are there any protections against that, or are the costs just so insignificant that this rarely/never happens?
Both you and the provider should handle that
Always check what each provider actually does
Depending on your use case and/or provider you could:
- only allow files to be viewed through your website
- make files private: require an access token whenever a file is viewed
- most providers have some type of DDoS protection (although you will probably have to pay)
- rate limit it yourself
None of these are perfect, but they can at least help