All requests from the queue have been processed, the crawler will shut down.
I'm working on a news web crawler and setting
purgeOnStart=false
so that I don't scrape duplicate news. However, in some cases I get the message "All requests from the queue have been processed, the crawler will shut down." and the crawler doesn't run. Any suggestions to fix this issue?
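For context, keeping the queue between runs is typically configured like this. This is a minimal sketch assuming Crawlee's `CheerioCrawler`; the URL and handler body are placeholders, not part of the original post:

```ts
import { CheerioCrawler, Configuration } from 'crawlee';

// purgeOnStart defaults to true; disabling it preserves the request
// queue (and its handled/dedup state) across runs, so URLs that were
// already processed are not scraped again.
const config = new Configuration({ purgeOnStart: false });

const crawler = new CheerioCrawler(
    {
        async requestHandler({ request, log }) {
            log.info(`Processing ${request.url}`);
        },
    },
    config, // crawlers accept a Configuration as the second argument
);

await crawler.run(['https://example.com/news']);
```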
rival-black•8mo ago
The message means that all requests in your requestQueue have already been handled, so there is no point in processing them again.
deep-jadeOP•8mo ago
Yes, I know that, but how do I add more URLs to the requestQueue? The domains I'm collecting data from have more news to crawl.
rival-black•8mo ago
You can add the same URL, but then you need to specify a different uniqueKey for each request. By default, the uniqueKey is the same as the URL.
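A minimal sketch of that idea, assuming Crawlee's `CheerioCrawler`; the listing URL and the `a.article-link` selector are illustrative placeholders:

```ts
import { CheerioCrawler } from 'crawlee';

const crawler = new CheerioCrawler({
    async requestHandler({ request, enqueueLinks, log }) {
        log.info(`Processing ${request.url}`);
        // Enqueue article links found on the listing page; articles
        // keep their default uniqueKey (the URL), so ones already
        // handled in a previous run stay deduplicated.
        await enqueueLinks({ selector: 'a.article-link' });
    },
});

// With purgeOnStart=false the queue remembers handled requests and
// deduplicates by uniqueKey. Giving the listing page a run-specific
// uniqueKey forces it to be fetched again on every run.
await crawler.run([
    {
        url: 'https://example.com/news',
        uniqueKey: `https://example.com/news#${Date.now()}`,
    },
]);
```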
deep-jadeOP•8mo ago
Alright I'll try that
deep-jadeOP•8mo ago
Thanks