I have 99 URLs in the queue, but the scraper finishes the crawl after a few URLs. Why?

The scraper finishes the crawl after only a few URLs every time. I have 99 URLs added to the queue. This is my config:
8 Replies
foreign-sapphire (OP) · 3y ago
I add URLs like this:
const playRequestQ = await RequestQueue.open('q1');
const urlsToAdd: { url: string }[] = [];
nodeEvts.on('links', onLinks);

async function onLinks(link: any) {
    urlsToAdd.push({ url: `${link}` });
    // Flush the batch and start the crawler once 99 links have accumulated.
    if (urlsToAdd.length % 99 === 0) {
        await playRequestQ.addRequests(urlsToAdd);
        crawler.run();
    }
}
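For reference, a pattern that avoids kicking off crawler.run() from inside the batching condition is to enqueue everything first and start the crawler once. This is only a sketch built around the snippet above; the 'done' event, the requestHandler body, and the EventEmitter stand-in are assumptions, not part of the original post:

import { EventEmitter } from 'node:events';
import { PlaywrightCrawler, RequestQueue } from 'crawlee';

const nodeEvts = new EventEmitter(); // stand-in for the poster's link source

const playRequestQ = await RequestQueue.open('q1');
const urlsToAdd: { url: string }[] = [];

const crawler = new PlaywrightCrawler({
    requestQueue: playRequestQ,
    requestHandler: async ({ request }) => {
        console.log(`Crawled ${request.url}`); // placeholder handler
    },
});

nodeEvts.on('links', (link: string) => {
    urlsToAdd.push({ url: link });
});

// Hypothetical 'done' event: fires once all links have been emitted.
nodeEvts.once('done', async () => {
    await playRequestQ.addRequests(urlsToAdd);
    await crawler.run(); // start the crawler exactly once
});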
foreign-sapphire (OP) · 3y ago
I can see the URLs added inside the queue.
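One way to verify what actually landed in the queue (a sketch; playRequestQ is the queue from the snippet above) is to compare its total and handled counts:

// Sketch: getInfo() reports how many requests the queue accepted and handled.
const info = await playRequestQ.getInfo();
console.log(`total: ${info?.totalRequestCount}, handled: ${info?.handledRequestCount}`);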
eastern-cyan · 3y ago
Maybe the URLs are not unique.
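For context: Crawlee's RequestQueue deduplicates on each request's uniqueKey, which defaults to the URL, so re-adding the same URL is silently skipped. A minimal sketch of that behavior (the example URL is hypothetical):

import { RequestQueue } from 'crawlee';

const queue = await RequestQueue.open('q1');

const first = await queue.addRequest({ url: 'https://example.com/page' });
console.log(first.wasAlreadyPresent);  // false: newly enqueued

const second = await queue.addRequest({ url: 'https://example.com/page' });
console.log(second.wasAlreadyPresent); // true: deduplicated, will not be crawled again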
foreign-sapphire (OP) · 3y ago
OK, that was the issue, I think. Thanks 👍
genetic-orange · 3y ago
@HonzaS what if I want to scrape the same URL again and again? Do I need to re-initialize the crawler?
eastern-cyan · 3y ago
Then you need a different uniqueKey for each request. By default, the uniqueKey is the same as the URL, but you can set your own: https://crawlee.dev/api/core/class/Request#uniqueKey
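A minimal sketch of that, assuming the same queue name as above; the timestamp-based key is just one way to make each enqueue unique, and the URL is hypothetical:

import { RequestQueue } from 'crawlee';

const queue = await RequestQueue.open('q1');

// Same URL added twice; distinct uniqueKeys bypass the deduplication.
for (let i = 0; i < 2; i++) {
    await queue.addRequest({
        url: 'https://example.com/page',
        uniqueKey: `https://example.com/page#${Date.now()}-${i}`,
    });
}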
genetic-orange · 3y ago
Thank you very much, @HonzaS