Handle browser failure

I have a Puppeteer scraper that performs many actions on a page, and at some point the browser fails. It's a page with infinite scroll where I have to click a button and scroll down repeatedly. After 70-80 interactions the browser crashes, and the request gets retried as usual. The point of those interactions is to collect URLs that I want to navigate to later. I want to handle the browser crash somehow, so that when it happens I can continue with the URLs I've already collected.
1 Reply
Pepa J · 2y ago
Hi @NeoNomade. Is there any error related to the crashing? Otherwise, you may just accumulate the URLs in some kind of global context and enqueue them in the errorHandler.
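
A minimal sketch of that idea, assuming Crawlee's `PuppeteerCrawler`: URLs are accumulated in an in-memory map keyed by the listing URL, and the `errorHandler` (which Crawlee calls when the request handler throws, before the retry) enqueues whatever was collected before the browser died. The selectors (`button.load-more`, `a.item-link`), the iteration count, the `DETAIL` label, and the `collectedUrls` map are illustrative placeholders, not part of the original scraper.

```ts
import { PuppeteerCrawler } from 'crawlee';

// Hypothetical in-memory accumulator: detail URLs collected per listing URL.
const collectedUrls = new Map<string, Set<string>>();

const crawler = new PuppeteerCrawler({
    requestHandler: async ({ request, page, crawler }) => {
        // Reuse anything collected before a previous crash of this request.
        const seen = collectedUrls.get(request.url) ?? new Set<string>();
        collectedUrls.set(request.url, seen);

        // Illustrative interaction loop: click + scroll, harvesting URLs as you go.
        for (let i = 0; i < 100; i++) {
            await page.click('button.load-more');
            await page.evaluate(() => window.scrollTo(0, document.body.scrollHeight));
            const urls = await page.$$eval('a.item-link', (els) =>
                els.map((el) => (el as HTMLAnchorElement).href),
            );
            urls.forEach((u) => seen.add(u));
        }

        // Happy path: the browser survived, enqueue everything at once.
        // (Handling of the enqueued DETAIL requests is omitted here.)
        await crawler.addRequests([...seen].map((url) => ({ url, label: 'DETAIL' })));
        seen.clear();
    },

    // Called when the request handler throws (e.g. the browser crashed),
    // before the request is retried: enqueue whatever was collected so far,
    // so the work already done is not lost even if every retry fails too.
    errorHandler: async ({ request, crawler, log }, error) => {
        const seen = collectedUrls.get(request.url);
        if (seen?.size) {
            log.warning(
                `Browser failed after collecting ${seen.size} URLs; enqueueing them now (${error.message})`,
            );
            await crawler.addRequests([...seen].map((url) => ({ url, label: 'DETAIL' })));
            seen.clear();
        }
    },
});

await crawler.run(['https://example.com/infinite-list']);
```

If the process itself can die (not just the browser), the map could be persisted with Crawlee's key-value store instead of kept in memory, so the collected URLs survive a full restart.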
