Best practice to stop/crash the actor/crawler on high ratio of errors?
The following snippet works well for me, but it smells... does somebody have a cleaner approach?
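(The snippet itself isn't shown in the thread. For context, a minimal sketch of the kind of error-ratio stop being described, assuming Crawlee's `PlaywrightCrawler`; the thresholds and the abort logic are illustrative, not the poster's actual code:)

```ts
import { PlaywrightCrawler } from 'crawlee';

// Illustrative thresholds; tune for your own use case.
const MIN_SAMPLE = 50;       // don't judge the ratio on too small a sample
const MAX_ERROR_RATIO = 0.3; // stop once 30% of finished requests have failed

const crawler = new PlaywrightCrawler({
    async requestHandler({ page }) {
        // ... normal scraping logic ...
    },
    // Called once a request has exhausted all of its retries.
    async failedRequestHandler({ crawler, log }) {
        const { requestsFailed, requestsFinished } = crawler.stats.state;
        const total = requestsFailed + requestsFinished;
        if (total >= MIN_SAMPLE && requestsFailed / total >= MAX_ERROR_RATIO) {
            log.error(`Error ratio ${requestsFailed}/${total} is too high, aborting run.`);
            // Graceful stop: the promise returned by crawler.run() resolves.
            // A hard crash (process.exit(1)) would be the blunt alternative.
            void crawler.autoscaledPool?.abort();
        }
    },
});

await crawler.run(['https://example.com']);
```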
rare-sapphire•2y ago
There is now a message on Apify which, I guess, comes from the crawler when there are problems. So maybe you can use that, if you can find out what is generating that message.

xenial-blackOP•2y ago
This @HonzaS guy knows stuff 🙏
afraid-scarlet•2y ago
You can use stats: https://crawlee.dev/api/browser-crawler/class/BrowserCrawler#stats. However, the approach itself is not safe: you are supposed to handle sessions and/or bot protection to resolve blocking with logic, not by hammering the website with many runs. I.e., set concurrency, max request retries, logic for session.markBad(), etc., and implement a scalable crawler; see the sketch below.
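(A minimal sketch of that kind of configuration, again assuming `PlaywrightCrawler`. The status codes treated as "blocked" and the concurrency/retry limits are illustrative assumptions, not values from the thread:)

```ts
import { PlaywrightCrawler } from 'crawlee';

const crawler = new PlaywrightCrawler({
    // Limit pressure on the site instead of hammering it with many runs.
    maxConcurrency: 10,
    maxRequestRetries: 3,
    // Rotate sessions (cookies, fingerprints) via the built-in session pool.
    useSessionPool: true,
    persistCookiesPerSession: true,
    async requestHandler({ request, response, session, log }) {
        const status = response?.status();
        // Illustrative blocking check; real detection depends on the target site.
        if (status === 403 || status === 429) {
            // Mark the session so the pool rotates away from it, then throw
            // so Crawlee retries the request, likely with a different session.
            session?.markBad();
            throw new Error(`Request to ${request.url} blocked with status ${status}`);
        }
        log.info(`Scraping ${request.url}`);
        // ... extraction logic ...
    },
});

await crawler.run(['https://example.com']);
```

This way, blocking is absorbed by session rotation and bounded retries instead of inflating the failure count until the whole run has to be killed.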