Worrying about optimistic update performance in useInfiniteQuery
hey guys, so my worry is basically the title. I'm using a useInfiniteQuery that fetches posts from the backend database using pagination. The infinite query handles everything in one, and what I mean by that is it takes into account:
1) keyword filtering
2) sorting + score filtering
These two things are passed as query string params when I execute the API call, so they're part of the queryKey, meaning the data is cached by an ID, the keywords array joined into a string, the sort, and the score.
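Roughly, the query looks like this. This is only a sketch assuming TanStack Query v5; fetchPosts, usePosts, and the response shape are placeholder names, not my exact code:

```ts
import { useInfiniteQuery } from '@tanstack/react-query';

interface Post {
  id: string;
  isSaved: boolean;
}

// Placeholder for my paginated API call; the filters go into the query string params
declare function fetchPosts(params: {
  audienceId: string;
  keywords: string[];
  sort: string;
  score: number;
  page: number;
}): Promise<{ posts: Post[]; nextPage: number | null }>;

function usePosts(audienceId: string, keywords: string[], sort: string, score: number) {
  return useInfiniteQuery({
    // cached per audience + filter combination
    queryKey: [audienceId, keywords.join(','), sort, score],
    queryFn: ({ pageParam }) =>
      fetchPosts({ audienceId, keywords, sort, score, page: pageParam }),
    initialPageParam: 1,
    getNextPageParam: (lastPage) => lastPage.nextPage,
  });
}
```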
Now that's all fine. The concern is when I toggle saving a post: I'd like to optimistically update the post's saved status. I've got a code snippet of how I'm doing it in useMutation (a rough sketch is below, and the full snippet is in the comment). Basically, I first cancel any queries, snapshot the cache, then optimistically update the active query.

Since the queryKey includes the ID (an audience ID, basically a folder containing all the posts), keywords, sort, and score, I optimistically update the currently active query (with its keywords, sort, score, etc.), but then I invalidate all queries that match just the audience (folder) ID, since those queries can also include the post. Say there were no keywords, sort, or score filters: the post could still exist deep inside the general infinite query, right?
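Here's roughly what my mutation flow looks like. Again, just a sketch, the real snippet is in the comment below; toggleSavePost, useToggleSave, and updatePages are placeholder names (updatePages is the page-walking helper shown further down):

```ts
import { useMutation, useQueryClient, type QueryKey } from '@tanstack/react-query';

// Placeholder for my actual save/unsave API call
declare function toggleSavePost(postId: string): Promise<void>;

// Placeholder for the page-walking helper sketched further down
declare function updatePages(data: any, postId: string): any;

// activeKey is the full key of the list currently on screen:
// [audienceId, keywords.join(','), sort, score]
function useToggleSave(activeKey: QueryKey, audienceId: string) {
  const queryClient = useQueryClient();

  return useMutation({
    mutationFn: (postId: string) => toggleSavePost(postId),
    onMutate: async (postId) => {
      // 1) cancel in-flight fetches so they don't overwrite the optimistic data
      await queryClient.cancelQueries({ queryKey: activeKey });

      // 2) snapshot the current cache entry so onError can roll back
      const previous = queryClient.getQueryData(activeKey);

      // 3) optimistically flip the saved flag inside the cached pages
      queryClient.setQueryData(activeKey, (old: any) => updatePages(old, postId));

      return { previous };
    },
    onError: (_err, _postId, context) => {
      // roll back to the snapshot if the request fails
      if (context?.previous) queryClient.setQueryData(activeKey, context.previous);
    },
    onSettled: () => {
      // refetch every cached variation for this audience: partial key match on
      // anything whose queryKey starts with [audienceId]
      queryClient.invalidateQueries({ queryKey: [audienceId] });
    },
  });
}
```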
My issue is with how I'm doing the optimistic update. I'm having to map over all pages and then over every post, checking post.id === id and, if it matches, flipping isSaved: !post.isSaved.

Now if the post is deep inside, say on the 50th page, that's a huge map 🙁 each page currently has 50 posts, so that's around 2,500 posts touched just to change one. I know computers are fast and the JS code is also fast, but I was just wondering if there's a more optimal approach, like somehow knowing which page the post exists on?
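For reference, the full traversal I do inside setQueryData looks something like this (a sketch assuming the page shape from the first snippet above):

```ts
import type { InfiniteData } from '@tanstack/react-query';

interface Post {
  id: string;
  isSaved: boolean;
}

interface Page {
  posts: Post[];
  nextPage: number | null;
}

// Walks every cached page and flips isSaved on the matching post.
// With 50 posts per page and the post sitting on page 50, this copies and
// checks ~2,500 post objects even though only one of them actually changes.
function updatePages(
  data: InfiniteData<Page> | undefined,
  postId: string
): InfiniteData<Page> | undefined {
  if (!data) return data;
  return {
    ...data,
    pages: data.pages.map((page) => ({
      ...page,
      posts: page.posts.map((post) =>
        post.id === postId ? { ...post, isSaved: !post.isSaved } : post
      ),
    })),
  };
}
```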
And one more thing: is it justified that I'm invalidating all the other queries whose queryKey starts with the audience ID, rather than matching the full id, keywords, sort, score key?
I'll post the relevant code snippet in a comment below so y'all can take a look, as I've run out of space here.