All requests from the queue have been processed, the crawler will shut down.

I'm working on a news web crawler and setting purgeOnStart=false so that I don't scrape duplicate news. However, in some cases I get the message "All requests from the queue have been processed, the crawler will shut down." and the crawler doesn't run. Any suggestions on how to fix this issue?
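For context, a minimal sketch of the setup described above, assuming the Crawlee JavaScript library and its `Configuration` option `purgeOnStart`; the start URL and handler body are placeholders:

```ts
import { CheerioCrawler, Configuration } from 'crawlee';

// Keep the request queue (and other storages) between runs so that
// URLs handled in a previous run are not scraped again.
const config = new Configuration({ purgeOnStart: false });

const crawler = new CheerioCrawler(
    {
        async requestHandler({ request, log }) {
            log.info(`Scraping ${request.url}`);
            // ... extract the news article here ...
        },
    },
    config,
);

await crawler.run(['https://example.com/news']); // hypothetical start URL
```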
rival-black
rival-black•8mo ago
The message means that all requests in your requestQueue have already been handled, so there is no point in processing them again.
deep-jade
deep-jadeOP•8mo ago
Yes, I know that. But how do I add more URLs to the requestQueue? The domains I'm collecting data from have more.
rival-black
rival-black•8mo ago
You can add the same URL, but then you need to specify a different uniqueKey for each request. By default, the uniqueKey is the same as the URL.
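A minimal sketch of that approach, assuming Crawlee's `addRequests`, which accepts a `uniqueKey` per request; the listing URL and the date-based key scheme here are made up for illustration:

```ts
import { CheerioCrawler } from 'crawlee';

const crawler = new CheerioCrawler({
    async requestHandler({ request, log }) {
        log.info(`Scraping ${request.url}`);
        // ... extract the latest article links here ...
    },
});

// Re-enqueue a listing page that was already handled in a previous run
// by giving it a run-specific uniqueKey, so the queue's deduplication
// does not skip it.
const today = new Date().toISOString().slice(0, 10); // e.g. '2024-05-01'
await crawler.addRequests([
    {
        url: 'https://example.com/news', // hypothetical listing URL
        uniqueKey: `https://example.com/news#${today}`,
    },
]);

await crawler.run();
```

With a date-based uniqueKey like this, the listing page is re-crawled at most once per day, while individual article URLs keep their default uniqueKey and so stay deduplicated across runs.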
deep-jade
deep-jadeOP•8mo ago
Alright, I'll try that.
deep-jade
deep-jadeOP•8mo ago
Thanks
