TanStack · 3y ago
passive-yellow

How to prevent making a query if an id is removed from the list of ids used as the cache key

Hi! In our app we've got queries that often involve passing lists of ids to various API endpoints. In most cases the response returns data for each item separately; there's no aggregated data that depends on the full list of ids. The issue we're having: if we already have data for items with ids e.g. 1, 2 and 3, and someone removes the item with id 2 from the list, we make a new query for items with ids 1 and 3, even though we already have this data, only because the list of ids has changed and it's used as part of the cache key. Is there an easy and efficient way to figure out that we already have this data and avoid making a new request?
7 Replies
stuck-chocolate · 3y ago
Search for batching in the help forum here. This should help you
passive-yellow (OP) · 3y ago
@M00LTi Thanks! Are you sure this fits my use case? It looks like batching combines multiple similar requests within a given timeframe into a single request. What I need is to prevent making requests at all: e.g. a request items.where(ids: [1,2,3]) is made on page load, and then 5 minutes later a user removes one item, so the new request would be items.where(ids: [1,3]). It would be even better (though not necessary) if it kept track of all ids already fetched, so that when a user adds a new item with id e.g. 4, instead of making the request items.where(ids: [1,3,4]) it would make the request items.where(ids: [4]), as it already has data for items with ids 1, 2, and 3.
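The delta-fetching idea described above can be sketched in plain TypeScript. This is a minimal illustration, not from the thread; the `Item` shape, `idsToFetch` name, and id-keyed `Map` cache are assumptions:

```typescript
// Sketch: given the requested ids and a map of items already fetched,
// return only the ids that still need a request. `Item` and the cache
// shape are assumptions for illustration.
type Item = { id: number; name: string };

function idsToFetch(requested: number[], cache: Map<number, Item>): number[] {
  return requested.filter((id) => !cache.has(id));
}

const cache = new Map<number, Item>([
  [1, { id: 1, name: "a" }],
  [2, { id: 2, name: "b" }],
  [3, { id: 3, name: "c" }],
]);

// Item 2 was removed from the list and item 4 was added:
console.log(idsToFetch([1, 3, 4], cache)); // → [4]
```

The same filter covers both cases in the message above: removing an item yields an empty fetch list, and adding one yields only the new id.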
stuck-chocolate · 3y ago
GitHub
Batching · TanStack query · Discussion #365
So yesterday I wrote up some thoughts on batching (see https://github.com/tannerlinsley/react-query/discussions/364), but this was too eager of me. For proper support of batching, it would be reall...
stuck-chocolate · 3y ago
You should probably read it yourself to check.
passive-yellow (OP) · 3y ago
@M00LTi After reading the README more carefully, it looks like it's not what I need. However, I really like the API, and if I don't find a library that already handles my use case, I might try to come up with a solution that has a similar API. I don't need batching; I want to get data as soon as the params change. What I want is smart caching of list items: if a user adds or removes an item from the list, I don't want to re-fetch data for items I already have data for.
stuck-chocolate · 3y ago
I guess you could write a little intermediate function. useQuery is essentially a bijection between cache entries and (asynchronous function × params) pairs.
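The intermediate-function idea above can be sketched in plain TypeScript, leaving react-query out of the picture. This is an illustrative assumption, not code from the thread; `makeCachedFetcher`, `fetchByIds`, and the `Item` shape are made-up names:

```typescript
// Sketch of an "intermediate function": wrap a list-based fetcher so that
// only ids missing from a local per-id cache are actually sent over the wire.
type Item = { id: number };

function makeCachedFetcher(fetchByIds: (ids: number[]) => Promise<Item[]>) {
  const cache = new Map<number, Item>();
  return async (ids: number[]): Promise<Item[]> => {
    const missing = ids.filter((id) => !cache.has(id));
    if (missing.length > 0) {
      for (const item of await fetchByIds(missing)) cache.set(item.id, item);
    }
    // Assemble the full result from the cache, in the requested order.
    return ids.map((id) => cache.get(id)!);
  };
}

// Fake network call that records which ids were actually requested.
const requested: number[][] = [];
const get = makeCachedFetcher(async (ids) => {
  requested.push(ids);
  return ids.map((id) => ({ id }));
});

(async () => {
  await get([1, 2, 3]); // first call fetches ids 1, 2, 3
  await get([1, 3, 4]); // only id 4 is fetched; 1 and 3 come from the cache
  console.log(requested); // → [[1, 2, 3], [4]]
})();
```

A real version would still need invalidation and staleness handling, which is exactly the part of the react-query cache the OP would rather not replicate.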
passive-yellow (OP) · 3y ago
I'm afraid I'd need to replicate part of the react-query cache to store merged responses, and also figure out a query cache key for this intermediate function so that it doesn't depend on the list of ids, only on the other params... I was hoping there was a ready-made solution for this use case, as it doesn't seem to be that uncommon.
