Is there a way to close the browser in PuppeteerCrawler?

My crawler gets stuck with request timeouts at a concurrency of 20. If I could close the browser on a request timeout, that might solve the issue.
2 Replies
rare-sapphire (3y ago)
This should be handled by higher-level logic: either manage sessions, max pages per browser, and retries in the crawler, or run the crawler on batches of URLs. It sounds like your crawler is being blocked per visitor.
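A minimal sketch of that higher-level approach, assuming Crawlee's `PuppeteerCrawler`. The specific numbers (retries, timeouts, usage counts) are illustrative, not recommendations:

```javascript
import { PuppeteerCrawler } from 'crawlee';

const crawler = new PuppeteerCrawler({
    maxConcurrency: 20,
    maxRequestRetries: 3,          // timed-out requests go back to the queue
    requestHandlerTimeoutSecs: 60, // abort handlers that hang
    useSessionPool: true,
    sessionPoolOptions: {
        sessionOptions: { maxUsageCount: 10 }, // rotate sessions regularly
    },
    browserPoolOptions: {
        retireBrowserAfterPageCount: 50, // recycle browsers periodically
    },
    requestHandler: async ({ page, request }) => {
        // ... scrape the page ...
    },
});

// Running in batches also bounds how long any one browser lives.
await crawler.run(['https://example.com']);
```

With session rotation and periodic browser retirement, you rarely need to close a browser manually; the pool recycles them for you.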
harsh-harlequin (3y ago)
You can simply throw an error (or return) from the requestHandler; a thrown error makes Crawlee retry the request. I would also reduce concurrency to 1 and debug where it actually gets stuck.
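A sketch of that debugging setup, assuming Crawlee's `PuppeteerCrawler` (the check inside the handler is a hypothetical example):

```javascript
import { PuppeteerCrawler } from 'crawlee';

const crawler = new PuppeteerCrawler({
    maxConcurrency: 1,        // single page makes hangs easy to spot in logs
    navigationTimeoutSecs: 30,
    maxRequestRetries: 2,
    requestHandler: async ({ page, request, session }) => {
        const title = await page.title();
        if (!title) {
            session?.markBad(); // rotate away from a possibly blocked session
            // Throwing returns the request to the queue for a retry.
            throw new Error(`Empty page for ${request.url}`);
        }
    },
});

await crawler.run(['https://example.com']);
```

Failed requests end up retried with a fresh page (and, once the session is retired, a fresh browser context), which covers the original "close the browser on timeout" intent.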
