For now, or until that point, I would suggest having a daily cron that searches for data points past the 1- or 2-month mark and saves them to a DB (like D1) for longer storage. Or (depending on your use case) flatten them into a daily JSON file (all the data points in a day) and save it to object storage (like R2).
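For what it's worth, here's a minimal sketch of the R2 variant as a scheduled Worker: it pulls the last day's rows from the Analytics Engine SQL API and writes them to R2 as one JSON file per day. The binding/secret names (`ARCHIVE_BUCKET`, `CF_ACCOUNT_ID`, `CF_API_TOKEN`) and the dataset name `my_dataset` are placeholders for your own setup, not anything official:

```ts
// Scheduled Worker sketch: archive yesterday's Analytics Engine rows to R2.
// Assumes a cron trigger in wrangler.toml and an R2 bucket binding.
export interface Env {
  ARCHIVE_BUCKET: R2Bucket; // R2 binding (assumed name)
  CF_ACCOUNT_ID: string;    // your Cloudflare account ID
  CF_API_TOKEN: string;     // API token with Analytics read access
}

export default {
  async scheduled(_event: ScheduledEvent, env: Env, _ctx: ExecutionContext) {
    // Query the Analytics Engine SQL API for the last 24h of rows.
    // "my_dataset" is a placeholder for your dataset name.
    const sql = `
      SELECT * FROM my_dataset
      WHERE timestamp > NOW() - INTERVAL '1' DAY
      FORMAT JSON`;

    const res = await fetch(
      `https://api.cloudflare.com/client/v4/accounts/${env.CF_ACCOUNT_ID}/analytics_engine/sql`,
      {
        method: "POST",
        headers: { Authorization: `Bearer ${env.CF_API_TOKEN}` },
        body: sql,
      },
    );
    if (!res.ok) throw new Error(`SQL API error: ${res.status}`);

    // One flattened JSON file per day, keyed by date (yesterday's date).
    const day = new Date(Date.now() - 86_400_000).toISOString().slice(0, 10);
    await env.ARCHIVE_BUCKET.put(`analytics/${day}.json`, await res.text());
  },
};
```

Swapping R2 for D1 is mostly a matter of replacing the `put` with `env.DB.prepare(...).bind(...).run()` inserts per row.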
I recently discovered Workers Analytics Engine. Can someone explain to me why it is so expensive? $0.25 per 1 million write requests seems a bit high, especially when compared to D1!
How has the feedback been so far? 3 months of retention is fine for non-critical data (such as referral link tracking), but for anything else we'd still need to maintain our Kafka -> Druid pipeline.
Simply paying e.g. per GB/month for everything older than 3 months would be amazing from a client perspective. Since that price wouldn't reflect the increased load from read requests, though, it's probably not a realistic solution.
I'm a noob dev here trying to query the Zero Trust Gateway for how many queries we blocked or redirected. I have set up my API keys and I'm using the GraphiQL tool.
How can I figure out what GraphQL query I need to perform? Thank you!
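Not the asker, but for anyone landing here: GraphiQL's Docs sidebar lets you browse the schema to find the Gateway datasets and their fields. A query against Cloudflare's GraphQL Analytics API generally looks like the sketch below. The dataset and field names used here (`gatewayResolverQueriesAdaptiveGroups`, `resolverDecision`) are assumptions to verify in the schema explorer, and the token/account values are placeholders:

```ts
// Sketch: fetch Gateway query counts grouped by resolver decision
// (blocked, allowed, ...) for the last 24 hours.
const ACCOUNT_TAG = "<your account id>"; // placeholder
const API_TOKEN = "<your api token>";    // placeholder, needs Analytics read access

// Dataset and dimension names below are assumptions — check GraphiQL's Docs panel.
const query = `
  query ($accountTag: string, $start: Time, $end: Time) {
    viewer {
      accounts(filter: { accountTag: $accountTag }) {
        gatewayResolverQueriesAdaptiveGroups(
          limit: 100
          filter: { datetime_geq: $start, datetime_leq: $end }
        ) {
          count
          dimensions { resolverDecision }
        }
      }
    }
  }`;

const res = await fetch("https://api.cloudflare.com/client/v4/graphql", {
  method: "POST",
  headers: {
    Authorization: `Bearer ${API_TOKEN}`,
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    query,
    variables: {
      accountTag: ACCOUNT_TAG,
      start: new Date(Date.now() - 86_400_000).toISOString(), // 24h ago
      end: new Date().toISOString(),
    },
  }),
});
console.log(JSON.stringify(await res.json(), null, 2));
```

You can paste just the `query` string and variables straight into GraphiQL to iterate on it before scripting anything.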