Optimistic Updates with Multiple Cache Keys in TanStack Query
Hi everyone! 
I'm working on a feature to fetch and display a list of equipment items. The query supports pagination and filtering, which results in dynamic query keys (e.g., ["equipment/fetchEquipments", pageSize, page, serialNumber, conditions]).
When I perform a mutation (e.g., deleting or updating an item), the backend takes a few seconds to synchronize the changes. In the meantime, I need to optimistically update the cached data across all dynamic query keys to ensure the UI reflects the latest state.
Here’s what I’ve implemented so far:
setQueryData: Works well for a single, exact query key, but doesn't support filtering or targeting multiple dynamic keys (rough sketch just below).
Invalidating queries: Updates every matching key, but leads to unnecessary refetching, which I'd like to avoid for better performance (see the mutation sketch after the example code).
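To make the setQueryData point concrete, here is roughly what that attempt looks like (the data/id fields are placeholders for my actual ApiResponse and Equipment shapes, and deletedId comes from the mutation):

const queryClient = useQueryClient(); // from "@tanstack/react-query", inside the component/hook

// Removes the deleted item from ONE cached page. The key has to match exactly,
// so every other pageSize/page/filter combination keeps its stale copy.
queryClient.setQueryData<ApiResponse<Equipment[]>>(
  ["equipment/fetchEquipments", pageSize, page, serialNumber, conditions],
  (old) =>
    old ? { ...old, data: old.data.filter((item) => item.id !== deletedId) } : old
);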
Example Code:
import { useQuery } from "@tanstack/react-query";

// api, ApiResponse, Equipment, and FetchEquipmentsRequest are defined elsewhere in the project.
export const useFetchEquipments = ({ pageSize, page, serialNumber, conditions }: FetchEquipmentsRequest) => {
  return useQuery<ApiResponse<Equipment[]>>({
    queryKey: ["equipment/fetchEquipments", pageSize, page, serialNumber, conditions],
    queryFn: async () => {
      const { data } = await api.get<ApiResponse<Equipment[]>>("/equipment/all", {
        params: { pageSize, page, serialNumber, conditions },
      });
      return data;
    },
    staleTime: 5 * 60 * 1000, // treat results as fresh for 5 minutes
  });
};
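
And this is roughly the shape of the delete mutation where the optimistic update needs to happen. The hook name, the /equipment/:id endpoint, and the id type are placeholders; the backend call itself isn't the problem, only the cache update in onMutate:

import { useMutation, useQueryClient } from "@tanstack/react-query";

export const useDeleteEquipment = () => {
  const queryClient = useQueryClient();

  return useMutation({
    mutationFn: (id: number) => api.delete(`/equipment/${id}`), // hypothetical endpoint
    onMutate: async (id) => {
      // Stop in-flight fetches so they don't overwrite the optimistic state.
      await queryClient.cancelQueries({ queryKey: ["equipment/fetchEquipments"] });

      // TODO: this is the part I'm stuck on -- optimistically update every cached
      // ["equipment/fetchEquipments", ...] entry (all page/filter combinations)
      // without refetching them.
    },
    onSettled: () => {
      // Falling back to invalidation does work, but it refetches every matching
      // query, which is exactly what I'm trying to avoid while the backend syncs.
      queryClient.invalidateQueries({ queryKey: ["equipment/fetchEquipments"] });
    },
  });
};

How can I optimistically update all of these cached pages at once, given the dynamic query keys?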