TanStack•5mo ago
optimistic-gold

Caching server Functions?

Hello, to cache some server functions in my TanStack Start app I do this. The server function:
export const getPlatformStats = createServerFn({ method: "GET" })
  .middleware([authMiddleware, platformStatsCacheMiddleware])
  .validator(data => platformStatsSchema.parse(data))
  .handler(async ({ data: { mediaType } }) => {
    // ...
  });
The cache middleware:
export const platformStatsCacheMiddleware = createMiddleware({ type: "function" }).server(async ({ next, data }) => {
  const cacheKey = `platformStats:${JSON.stringify(data ?? null)}`;

  // Cached for 24 hours
  return getContainer()
    .then(c => c.cacheManager.wrap(
      cacheKey,
      async () => next(),
      { ttl: 24 * 60 * 60 * 1000 },
    ));
});
I'm using the npm package cache-manager under the hood, which is initialized with Redis in prod and in memory otherwise. Is this OK? Thanks in advance :).
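For anyone reading along, the `wrap()` shape the middleware relies on can be illustrated with a minimal in-memory stand-in. This is a sketch only; the real app uses the cache-manager package (Redis in prod), and the class name here is hypothetical:

```typescript
// Hypothetical in-memory stand-in for the cacheManager used in the middleware.
// wrap(): return the cached value if still fresh, otherwise run fn and cache it.
type Entry = { value: unknown; expiresAt: number };

class MemoryCache {
  private store = new Map<string, Entry>();

  async wrap<T>(
    key: string,
    fn: () => Promise<T>,
    opts: { ttl: number }, // ttl in milliseconds
  ): Promise<T> {
    const hit = this.store.get(key);
    if (hit && hit.expiresAt > Date.now()) {
      return hit.value as T; // cache hit: fn is not called
    }
    const value = await fn();
    this.store.set(key, { value, expiresAt: Date.now() + opts.ttl });
    return value;
  }
}
```

Note that because `async () => next()` is passed as `fn`, whatever `next()` resolves to is what gets stored, which is relevant to the memory discussion later in this thread.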
9 Replies
metropolitan-bronze•5mo ago
Why shouldn't it be okay?
optimistic-goldOP•5mo ago
just asking to be sure 🙂
metropolitan-bronze•5mo ago
It looks good to me. It could probably be generalized a bit so you could cache arbitrary functions.
passive-yellow•2mo ago
I tried this approach, but it didn't work well. Since the entire return value of next() is cached, a bunch of objects are kept alive in the cache, taking up a lot of memory and eventually causing OOM errors. Instead, I modified the middleware to cache only the result property returned from next():
const middleware = createMiddleware({ type: "function" }).server(
  async ({ next, functionId, data }) => {
    const cacheKey = `functionCache-${functionId}::${JSON.stringify(data)}`;
    console.debug(
      "Running function cache middleware for cacheKey:",
      cacheKey,
    );
    const result = await serverCache.fetch({
      cacheKey,
      fetchFn: async () => {
        const nextReturn = await next();
        if (nextReturn.error != null) {
          throw nextReturn.error;
        }
        return nextReturn.result;
      },
      options: { ttl: 60_000 * 5 },
    });
    return { result };
  },
);
While this seems to work, TypeScript complains: 'use functions must return the result of next()'. Am I breaking something, @Manuel Schiller? I can provide the implementation of serverCache if that's relevant, but it basically caches promises' results.
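For readers without access to the sandbox, a serverCache along those lines might look like the following. This is a hypothetical sketch of "caching promises' results" with a TTL, not the poster's actual implementation; only resolved values are stored, so rejections (the re-thrown nextReturn.error above) are never cached:

```typescript
// Hypothetical sketch of a serverCache that caches promise *results* by key.
type FetchArgs<T> = {
  cacheKey: string;
  fetchFn: () => Promise<T>;
  options: { ttl: number }; // ttl in milliseconds
};

const entries = new Map<string, { value: unknown; expiresAt: number }>();

const serverCache = {
  async fetch<T>({ cacheKey, fetchFn, options }: FetchArgs<T>): Promise<T> {
    const hit = entries.get(cacheKey);
    if (hit && hit.expiresAt > Date.now()) return hit.value as T;
    // Await first, store after: a rejected fetchFn leaves the cache untouched,
    // so the next call retries instead of serving a cached error.
    const value = await fetchFn();
    entries.set(cacheKey, { value, expiresAt: Date.now() + options.ttl });
    return value;
  },
};
```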
metropolitan-bronze•2mo ago
A full example would be nice.
passive-yellow•2mo ago
Here it is: https://codesandbox.io/p/devbox/heuristic-ardinghelli-tsgnpw?workspaceId=ws_A29v8CWYnC6AeiHpeerNGN You can see the content of the cache by accessing the /cache page. If the full return value from next() is cached and returned, the /cache page eventually shows very large objects.
flat-fuchsia•2w ago
@Kuoun did you happen to have any luck with this eventually? Looking into how to cache serverFns in middleware correctly as well, without caching all the other unnecessary information...
passive-yellow•2w ago
I've used it as defined above for a while now, and it seems to work fine. We currently only use it for a couple of simple functions, so I can't say anything about complex use cases.
flat-fuchsia•2w ago
Thanks - alright. I have a couple hundred resources to cache so I’ll give it a shot and see how it goes…
