How do you handle crawl requests sent simultaneously by different users?

Currently, I can't crawl two different websites from two separate API calls. The second API call is simply ignored, and the crawler continues with the website from the first request. I'm using Express for my API.
5 Replies
correct-apricot · 2y ago
Create your own random uniqueKey (https://crawlee.dev/api/types/interface/RequestQueueHeadItem#uniqueKey). Otherwise, once a request has been handled, it isn't scraped again; that's a feature to prevent duplicate requests for the same data.
relaxed-coral (OP) · 2y ago
It shows the interface, but I don't see how to use it or import it.
correct-apricot · 2y ago
Yes. Add a uniqueKey property to the request object, and generate a random UUID for it with uuidv4() or something similar.
relaxed-coral (OP) · 2y ago
Is request.uniqueKey = uuidv4() what you are referring to, or do I have to manually enqueue URLs to add their uniqueKey?
correct-apricot · 2y ago
yep!