I have 99 URLs in the queue, but the scraper finishes the crawl after only a few. Why?
The scraper finishes the crawl after just a few URLs every time, even though I have 99 URLs added to the queue.
This is my config:


8 Replies
foreign-sapphireOP•3y ago
I add URLs like this:
foreign-sapphireOP•3y ago
I can see the URLs added inside the queue.
eastern-cyan•3y ago
Maybe the URLs are not unique — the queue skips duplicate requests.
foreign-sapphireOP•3y ago
OK, that was the issue, I think.
Thanks 👍
genetic-orange•3y ago
@HonzaS what if I want to scrape the same URL again and again? Do I need to re-initialize the crawler?
eastern-cyan•3y ago
Then you need to have a different uniqueKey for each request. By default, uniqueKey is the same as the URL, but you can set your own.
https://crawlee.dev/api/core/class/Request#uniqueKey
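As a rough sketch of the idea: requests that share a URL can be given distinct uniqueKey values so the request queue does not deduplicate them. The `buildRepeatRequests` helper and the `#run-N` suffix scheme below are illustrative choices, not part of Crawlee's API.

```typescript
// Sketch: build request inputs that share a URL but carry distinct
// uniqueKey values, so a Crawlee request queue will treat each one
// as a new request instead of deduplicating them.
interface RequestInput {
  url: string;
  uniqueKey: string;
}

// Hypothetical helper: the '#run-N' suffix is just one way to make keys unique.
function buildRepeatRequests(url: string, times: number): RequestInput[] {
  return Array.from({ length: times }, (_, i) => ({
    url,
    uniqueKey: `${url}#run-${i}`, // distinct key per repetition
  }));
}

const requests = buildRepeatRequests('https://example.com', 3);
console.log(requests.map((r) => r.uniqueKey).join(','));
```

Objects shaped like this can then be passed to the crawler (e.g. via `crawler.addRequests(requests)`), and each repetition will be crawled because its uniqueKey differs.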
genetic-orange•3y ago
Thank you very much @HonzaS