TanStack•2w ago
vicious-gold

Caching for fetching multiple ids in different batches

Hi, I would like to know if something like this is already built-in or doable without building a new feature. Context: Imagine that you have an API that returns users, and you have 2 ways of fetching them: 1. By id 2. By batch of ids. In my app, I sometimes fetch by id or by batch of ids depending on the context. With TanStack Query, is there a way to cache the user data by id, so that when I batch-request ids, it checks whether other batches already fetched some/all of the ids? So I could do:
// imagine a simple implementation of the `useUsers()` hook that returns `useQuery()`

const users = useUsers([1, 2])

// In another component
const users = useUsers([2, 3])
// ^ Only the id `3` would actually reach the server, as the id `2` was already cached by TanStack Query
I know that I could use useQueries(), but I want to make as few calls as possible to the API 🙂 Question: Is there a solution for this in TanStack Query (maybe TanStack DB)? My guess is that I could build my own cache layer outside of TanStack Query, and add a layer in the queryFn that checks the cache and filters the ids before calling the real queryFn.
2 Replies
adverse-sapphire
adverse-sapphire•2w ago
Take a look through this article. It covers how to seed one query's cache using the response from another, which sounds like effectively what you are looking for: https://tkdodo.eu/blog/seeding-the-query-cache I'm not sure if there is a built-in alternative, but at my company we often use a push approach for queries, seeding from our list endpoint into the individual endpoints.
Seeding the Query Cache
With suspense for data fetching on the horizon, it is now more important than ever to make sure your cache is seeded properly to avoid fetch waterfalls.
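The "push" seeding mentioned above boils down to writing list results into per-id cache entries as soon as the list resolves. Here is a minimal, library-free sketch of that idea: a plain `Map` stands in for the QueryClient's cache, and `getUserIdKey` / `fetchUserList` are made-up names for illustration; in a real app you would call `queryClient.setQueryData` with your per-user query key instead.

```typescript
type User = { id: string; name: string };

// A plain Map stands in for the QueryClient's cache in this sketch;
// in a real app this would be queryClient.setQueryData/getQueryData.
const queryCache = new Map<string, User>();

// Illustrative key helper (an assumption, not a TanStack API).
const getUserIdKey = (id: string) => `user:${id}`;

// Pretend list endpoint.
const fetchUserList = async (): Promise<User[]> => [
  { id: "1", name: "Ada" },
  { id: "2", name: "Bob" },
];

// "Push" seeding: when the list resolves, write each item into its
// own per-id cache entry so later detail queries start warm.
async function fetchAndSeedUsers(): Promise<User[]> {
  const users = await fetchUserList();
  users.forEach((user) => queryCache.set(getUserIdKey(user.id), user));
  return users;
}
```

After `fetchAndSeedUsers()` resolves, a detail lookup for id `"2"` finds its data already in the cache instead of hitting the API.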
vicious-gold
vicious-goldOP•2w ago
Oh interesting, thank you! For anyone interested, I ended up with this:
const getUserIdsKey = /** */;
const getUserIdKey = /** */;

export const useUsers = (ids: string[]) =>
  queryOptions({
    queryKey: getUserIdsKey(ids),
    queryFn: async (ctx) => {
      if (!ids.length) return [];

      const cached = ids
        .map((id) => ctx.client.getQueryData(getUserIdKey(id)))
        .filter(Boolean);
      const cachedIds = cached.map((user) => user.id);

      const notCachedIds = ids.filter((id) => !cachedIds.includes(id));

      if (!notCachedIds.length) {
        return cached;
      }

      const newData = await $getUsersByIds(notCachedIds);

      newData.forEach((user) =>
        ctx.client.setQueryData(getUserIdKey(user.id), user)
      );

      return [...newData, ...cached];
    },
  });
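One caveat with the snippet above: `[...newData, ...cached]` does not preserve the order of `ids`. A self-contained sketch of the same partition-and-merge step that keeps the requested order, where a `Map` stands in for the query cache and `fetchByIds` is a pretend batch endpoint (all names are illustrative, not the real hook):

```typescript
type User = { id: string; name: string };

// Stand-in for ctx.client.getQueryData — a plain Map for illustration.
const cache = new Map<string, User>([["2", { id: "2", name: "Bob" }]]);

// Pretend batch endpoint: returns one user object per requested id.
const fetchByIds = async (ids: string[]): Promise<User[]> =>
  ids.map((id) => ({ id, name: `User ${id}` }));

async function getUsers(ids: string[]): Promise<User[]> {
  // Partition: only ids missing from the cache go to the server.
  const missing = ids.filter((id) => !cache.has(id));
  const fetched = missing.length ? await fetchByIds(missing) : [];

  // Seed the per-id cache, as the real queryFn does with setQueryData.
  fetched.forEach((user) => cache.set(user.id, user));

  // Re-read every id from the cache so results follow the input order.
  return ids.map((id) => cache.get(id)!);
}
```

Calling `getUsers(["2", "3"])` only fetches id `"3"` and returns the users in the same order as the input ids.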
