How does Vercel run my code? Is my cache code working as intended?

You can use something for a long time and still not really know what you're working with. I just realized that something I'm doing may not be correct:
import https from "https";
import type { IncomingMessage } from "http";
import { parse } from "node-html-parser";
import { TRPCError } from "@trpc/server";
import { createTRPCRouter, protectedProcedure } from "~/server/api/trpc"; // adjust to your project's path

type Object = {
  //some data
};

const CACHE_DURATION = 60 * 60 * 1000; // 1 hour in milliseconds
// Module-level, in-memory cache shared by all calls within the same process
let cache: { timestamp: number; data: Object[] } | null = null;

export async function getObjects() {
  const currentTime = Date.now();

  // Serve from the in-memory cache while it is still fresh
  if (cache && currentTime - cache.timestamp < CACHE_DURATION)
    return cache.data;

  const objects = await new Promise<Object[]>((resolve, reject) => {
    https
      .get("https://example.com/", (res: IncomingMessage) => {
        let data = "";

        res.on("data", (chunk: Buffer) => {
          data += chunk.toString();
        });

        res.on("end", () => {
          // Parse the HTML and collect one entry per server row
          const root = parse(data);
          const servers = root.querySelectorAll("tr.server-list__row");

          const objects: Object[] = [];

          servers.forEach((server) => {
            const data = server.querySelector("foo");
            objects.push({
              //data
            });
          });

          objects.sort();

          resolve(objects);
        });

        res.on("error", (error: Error) => {
          reject(
            new TRPCError({
              code: "INTERNAL_SERVER_ERROR",
              message: error.message,
            })
          );
        });
      })
      // Also reject on request-level errors (e.g. DNS or connection failures)
      .on("error", (error: Error) => {
        reject(
          new TRPCError({
            code: "INTERNAL_SERVER_ERROR",
            message: error.message,
          })
        );
      });
  });

  // Store the freshly fetched data in the in-memory cache
  cache = { timestamp: currentTime, data: objects };
  return objects;
}

export const someRouter = createTRPCRouter({
  list: protectedProcedure.query(async () => {
    return await getObjects();
  }),
});
This is one of my routers. It fetches data from an external source. The goal is to cache that data so I don't have to fetch it on every client request. But I'm starting to understand that serverless doesn't work that way... I'm using T3 and hosting on Vercel. Does this code make any sense, or should I use something like Redis for this?
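For reference, this is roughly what I imagine the Redis option would look like (only a sketch: the ioredis client, the REDIS_URL environment variable, the cache key, and fetchObjectsFromSource are all placeholders; the last one stands in for the https/parse logic above with the in-memory cache removed):
```typescript
import Redis from "ioredis";

// Stand-in for the fetch-and-parse logic above, without the in-memory cache.
declare function fetchObjectsFromSource(): Promise<Object[]>;

// Assumed connection string; any Redis provider reachable from Vercel works the same way.
const redis = new Redis(process.env.REDIS_URL!);

const CACHE_KEY = "objects"; // made-up key name
const CACHE_TTL_SECONDS = 60 * 60; // same one hour as CACHE_DURATION

export async function getObjectsViaRedis(): Promise<Object[]> {
  // Shared cache hit: every serverless invocation sees the same Redis data
  const cached = await redis.get(CACHE_KEY);
  if (cached) return JSON.parse(cached) as Object[];

  // Cache miss: fetch fresh data and store it with an expiry
  const objects = await fetchObjectsFromSource();
  await redis.set(CACHE_KEY, JSON.stringify(objects), "EX", CACHE_TTL_SECONDS);
  return objects;
}
```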
Solution
NubeBuster · 16mo ago
I was caching the fetched data in the API route so that clients could request that third-party data through my API. This should not be done. Instead, I used getStaticProps with revalidate to put the data into the page, which means no additional request is needed.
export async function getStaticProps() {
  const objects = await getObjects();

  return {
    props: {
      objects,
    },
    // Re-generate the page in the background at most once every 600 seconds
    revalidate: 600,
  };
}
This solves the issue and even improves the general logic.
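For completeness, the page just receives that data as props; something like this (a sketch, assuming it lives in the same page file as getStaticProps, with a placeholder component name):
```tsx
import type { InferGetStaticPropsType } from "next";

// Placeholder page component: it renders the pre-built data directly,
// so the client never calls the API route for it.
export default function ObjectsPage({
  objects,
}: InferGetStaticPropsType<typeof getStaticProps>) {
  return <pre>{JSON.stringify(objects, null, 2)}</pre>;
}
```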