Efficient caching strategy for querying hundreds of records

My use case involves clients querying an API to retrieve records for up to 200 IDs at a time. To minimize the number of requests, and without relying on HTTP/2 multiplexing for hundreds of individual GETs, what's the best strategy for caching each record as it's retrieved so I'm not hitting the database every time? Should I send POST batches of 50 IDs and have the Worker perform 50 cacheable GET subqueries?
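For illustration, here is a minimal sketch of that batch-POST / per-ID-cache idea on a Worker using the Cache API (`caches.default`). The endpoint URLs, the `{ ids: [...] }` body shape, and the one-hour TTL are assumptions made up for the example, not part of any existing setup:

```ts
// Minimal sketch: POST a batch of IDs, fan out to per-ID lookups that each
// check the data-center cache before falling back to the origin.
// Assumes @cloudflare/workers-types; URLs, body shape, and TTL are illustrative.
export default {
  async fetch(request: Request, env: unknown, ctx: ExecutionContext): Promise<Response> {
    // Assumed body shape: { "ids": ["a", "b", ...] }, up to ~50 per batch.
    const { ids } = (await request.json()) as { ids: string[] };
    const cache = caches.default;

    const records = await Promise.all(
      ids.map(async (id) => {
        // Synthetic GET request used purely as the cache key for this ID.
        const cacheKey = new Request(`https://records.example.com/records/${id}`);
        let response = await cache.match(cacheKey);

        if (!response) {
          // Cache miss: fetch just this record from the DB-backed origin endpoint.
          const origin = await fetch(`https://origin.example.com/records/${id}`);
          if (!origin.ok) return { id, error: origin.status };

          // Re-wrap so headers are mutable, set an explicit TTL, and store a
          // copy without blocking the response to the client.
          response = new Response(origin.body, origin);
          response.headers.set("Cache-Control", "public, max-age=3600");
          ctx.waitUntil(cache.put(cacheKey, response.clone()));
        }

        return response.json();
      })
    );

    return new Response(JSON.stringify(records), {
      headers: { "Content-Type": "application/json" },
    });
  },
};
```

Each per-ID lookup falls through to the origin on a miss, so evictions only cost latency, not correctness, and batches of around 50 IDs keep each invocation under the free plan's 50-subrequest limit.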
bitte · 12mo ago
Workers KV would hit overage far quicker as far as I can see.
Hello, I’m Allie!
Are the IDs they query always the same? One of the big issues I see with this is that if most of the queries are unique, or aren’t requested very often, then they will be evicted before they are ever reused
bitte · 12mo ago
The IDs are variable and unrelated to each other, so simply caching the batch request would be useless. I also can't find an exact answer on how many responses can be cached at one time on either the Free or Pro plans, so I don't know if that subquery plan would be viable either.
Hello, I’m Allie!
Unlimited cached responses, but they can be evicted at any time
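Since the cache is best-effort either way, an even simpler variant of the per-ID GET subquery is to let `fetch`'s own `cf` caching options store the response, so the Worker never touches the Cache API directly. `cacheEverything` and `cacheTtl` are standard Workers fetch options, but the endpoint and TTL here are assumptions:

```ts
// Alternative sketch: let Cloudflare's edge cache handle each per-ID GET
// subrequest via the `cf` fetch options (URL and TTL are illustrative).
async function getRecord(id: string): Promise<unknown> {
  const response = await fetch(`https://origin.example.com/records/${id}`, {
    cf: {
      cacheEverything: true, // cache even if the origin sends no Cache-Control
      cacheTtl: 3600,        // edge TTL in seconds; eviction can still happen earlier
    },
  });
  // A miss (or an evicted entry) simply means this subrequest hits the origin again.
  return response.json();
}
```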
bitte · 12mo ago
Is eviction effectively random or does it depend on things like total cache usage?