I am trying to reset the Crawlee cache in Next.js, but it's not working. Can anyone help me?
This is my Next.js code. On the initial request the data is displayed, but if I request again it returns an empty array and I have to restart the application.
// crawl statistics logged when the request comes back with an empty array:
Crawl finished. Final request statistics: {"requestsFinished":0,"requestsFailed":0,"retryHistogram":[],"requestAvgFailedDurationMillis":null,"requestAvgFinishedDurationMillis":null,"requestsFinishedPerMinute":0,"requestsFailedPerMinute":0,"requestTotalDurationMillis":0,"requestsTotal":0,"crawlerRuntimeMillis":445}
// scraper code
import { CheerioCrawler, Configuration } from 'crawlee';

const config = Configuration.getGlobalConfig();
config.set('purgeOnStart', false);
config.set('persistStorage', false);

export const xyzLatest = async () => {
  const data: { title: string }[] = [];

  const crawler = new CheerioCrawler({
    async requestHandler({ $, enqueueLinks, request }): Promise<void> {
      console.log(request.url);
      // collect the post titles from the page
      const latest = $('div#left-div');
      latest.find('div.home-post-titles > h2').each((index, el) => {
        data.push({ title: $(el).text().trim() });
      });
      // await enqueueLinks();
    },
  });

  await crawler.run(['https://www.xyz.com/page/4']);
  console.log(data);
  return { data: data };
};
// nextjs api route code
import type { NextApiRequest, NextApiResponse } from 'next';
import { xyzLatest } from '../../scraper'; // adjust the path to wherever the scraper module lives

type Data = { data: { title: string }[] };

export default async function handler(
  req: NextApiRequest,
  res: NextApiResponse<Data>
) {
  const data = await xyzLatest();
  res.status(200).json(data);
}
3 Replies
flat-fuchsia•3y ago
Hey man! 👋
You could try keepAlive: false or crawler.teardown() once you're done. I'm really not sure this is the optimal solution, but it might work.
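A minimal sketch of that idea, assuming the scraper from the question above; keepAlive and crawler.teardown() are real Crawlee APIs, but whether this actually clears the stale state between runs is untested here:

import { CheerioCrawler } from 'crawlee';

export const xyzLatest = async () => {
  const data: { title: string }[] = [];

  const crawler = new CheerioCrawler({
    keepAlive: false, // don't keep the crawler alive once the request queue drains
    async requestHandler({ $ }) {
      $('div#left-div div.home-post-titles > h2').each((_i, el) => {
        data.push({ title: $(el).text().trim() });
      });
    },
  });

  await crawler.run(['https://www.xyz.com/page/4']);
  // release the crawler's resources after the run completes
  await crawler.teardown();

  return { data };
};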

flat-fuchsia•3y ago
Just pitching ideas, I have no idea if it would work: another solution could be to initialize the CheerioCrawler once. Then every time you need to run it, you could call crawler.addRequests(), then crawler.run().
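A rough sketch of that suggestion, assuming a module-level crawler that is reused across API calls (the actual Crawlee method is crawler.addRequests(); whether re-running the same instance behaves as hoped is untested):

import { CheerioCrawler } from 'crawlee';

const data: { title: string }[] = [];

// initialize the crawler once at module scope
const crawler = new CheerioCrawler({
  async requestHandler({ $ }) {
    $('div#left-div div.home-post-titles > h2').each((_i, el) => {
      data.push({ title: $(el).text().trim() });
    });
  },
});

export const xyzLatest = async () => {
  data.length = 0; // drop the titles collected by any previous run
  // queue the URL for this run, then start the crawler
  await crawler.addRequests(['https://www.xyz.com/page/4']);
  await crawler.run();
  return { data };
};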
inland-turquoiseOP•3y ago
Thanks bro, I will try it.