Caching for fetching multiple ids in different batches
Hi,
I would like to know if something like this is already built-in, or doable without building a new feature.
Context:
Imagine you have an API that returns users, with two ways of fetching them:
1. By id
2. By batch of ids
In my app, I sometimes fetch by id or by batch of ids depending on the context.
With TanStack Query, is there a way to cache the user data by id, so that when I request a batch of ids, it checks whether other batches have already fetched some/all of those ids?
I know that I could use
useQueries()
but I want to make as few calls as possible to the API 🙂
Question:
Is there a solution for this in TanStack Query (maybe TanStack DB?)?
My guess is that I could build my own cache layer outside of TanStack Query, and add a layer in the queryFn that checks the cache and filters the ids before calling the real queryFn.
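That filtering layer could be sketched roughly like this. This is a minimal plain-TypeScript sketch of the idea, not TanStack Query code: `User`, `fetchUsersByIds`, and the `Map`-based cache are all hypothetical names standing in for your real types, batch endpoint, and cache.

```typescript
type User = { id: string; name: string };

// Stand-in for the real batch API endpoint; records which ids it was asked
// for so we can see how many calls actually go out.
const apiCalls: string[][] = [];
async function fetchUsersByIds(ids: string[]): Promise<User[]> {
  apiCalls.push(ids);
  return ids.map((id) => ({ id, name: `user-${id}` }));
}

// The extra cache layer, keyed by user id.
const userCache = new Map<string, User>();

// Wrapper around the real queryFn: serve cached ids locally and only call
// the API for the ids that are missing from the cache.
async function getUsers(ids: string[]): Promise<User[]> {
  const missing = ids.filter((id) => !userCache.has(id));
  if (missing.length > 0) {
    const fetched = await fetchUsersByIds(missing);
    for (const user of fetched) userCache.set(user.id, user);
  }
  return ids.map((id) => userCache.get(id)!);
}
```

With this wrapper, `getUsers(["1", "2"])` followed by `getUsers(["2", "3"])` results in the second API call only asking for `"3"`, since `"2"` is already cached.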
2 Replies
adverse-sapphire•2w ago
Take a look through this article. It covers how to seed one query's cache using the response from another, which sounds like effectively what you are looking for:
https://tkdodo.eu/blog/seeding-the-query-cache
I'm not sure if there is an alternative built in, but at my company we often take a push approach for queries: seeding the individual-item queries from our list endpoint's response.
Seeding the Query Cache
With suspense for data fetching on the horizon, it is now more important than ever to make sure your cache is seeded properly to avoid fetch waterfalls.
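The push approach described above could look roughly like this. This is a self-contained sketch, not real TanStack Query internals: the `Map` stands in for the QueryClient's cache, and `setQueryData`/`getQueryData` here only mirror the names of the real TanStack Query methods.

```typescript
type User = { id: string; name: string };

// Stand-in query cache, keyed by serialized query key.
const cache = new Map<string, unknown>();
const keyOf = (key: unknown[]) => JSON.stringify(key);

function setQueryData(key: unknown[], data: unknown): void {
  cache.set(keyOf(key), data);
}
function getQueryData<T>(key: unknown[]): T | undefined {
  return cache.get(keyOf(key)) as T | undefined;
}

// queryFn for the batch/list query: after fetching, push ("seed") each user
// into the detail query's cache slot, so a later per-id query finds it there.
async function fetchUserBatch(ids: string[]): Promise<User[]> {
  const users: User[] = ids.map((id) => ({ id, name: `user-${id}` })); // fake API
  for (const user of users) {
    setQueryData(["user", user.id], user); // the "push" step
  }
  return users;
}
```

After `fetchUserBatch(["1", "2"])` resolves, a detail lookup like `getQueryData<User>(["user", "1"])` is served from the cache without another network call, which is the waterfall-avoidance the article describes.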
vicious-goldOP•2w ago
Oh interesting, thank you!
For anyone interested, I ended up with this: