
Hi, thank you for the PlaywrightCrawler tool. I want to ask how to handle a 429 status code caused by requesting too often. Is there a sleep-for-a-few-seconds method to handle this? Thanks for your attention.
Pepa J · 13mo ago
Hi @jackfsuia, great question! Please consider using the #crawlee-js or #🐍apify-python channels next time, so we can give you a better answer based on the tools you are using. For Crawlee, there is a parameter to limit the number of requests per minute:
import { PlaywrightCrawler } from 'crawlee';

const crawler = new PlaywrightCrawler({
    // ...
    // Throttle the crawler so it never exceeds 10 requests per minute.
    maxRequestsPerMinute: 10,
    // ...
});
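If you specifically want the crawler to pause for a few seconds before retrying after a 429, one option is Crawlee's errorHandler hook, which runs before a failed request is retried. The following is a minimal sketch, not part of the original reply: the 5-second delay is an arbitrary illustration value, and checking context.response for the 429 assumes the response object is still available when the handler fires.

import { PlaywrightCrawler } from 'crawlee';

const crawler = new PlaywrightCrawler({
    maxRequestsPerMinute: 10,
    maxRequestRetries: 5,
    // errorHandler runs before a failed request is re-queued for retry.
    async errorHandler({ response }) {
        // Illustrative back-off: if the server answered 429, wait a few
        // seconds before Crawlee retries the request.
        if (response?.status() === 429) {
            await new Promise((resolve) => setTimeout(resolve, 5_000));
        }
    },
    async requestHandler({ page }) {
        // ... your scraping logic ...
    },
});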
You might also find this article helpful on dealing with rate limiting by using proxies: https://docs.apify.com/academy/anti-scraping/techniques/rate-limiting
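For the proxy approach described in that article, a minimal sketch using Crawlee's ProxyConfiguration might look like the following; the proxy URLs are placeholders, not real endpoints.

import { PlaywrightCrawler, ProxyConfiguration } from 'crawlee';

// Rotate requests across several proxies so no single IP hits the rate limit.
// The URLs below are placeholders; substitute your own proxy endpoints.
const proxyConfiguration = new ProxyConfiguration({
    proxyUrls: [
        'http://proxy-1.example.com:8000',
        'http://proxy-2.example.com:8000',
    ],
});

const crawler = new PlaywrightCrawler({
    proxyConfiguration,
    maxRequestsPerMinute: 10,
    async requestHandler({ page }) {
        // ... your scraping logic ...
    },
});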