How do you handle crawl requests sent simultaneously by different users?
Currently, I can't crawl two different websites from two separate API calls. The second API call is simply ignored, and the API continues crawling the website from the first request.
I'm using Express for my API.
5 Replies
correct-apricot•2y ago
Create your own random https://crawlee.dev/api/types/interface/RequestQueueHeadItem#uniqueKey - otherwise, once a request has been handled it is not scraped again. That's a feature, to prevent duplicate requests for the same data.
relaxed-coralOP•2y ago
It shows the interface but I don't see how to use it or import it
correct-apricot•2y ago
Yes, in the request object add a
uniqueKey
property, and generate a random UUID for it with uuidv4() or something similar
relaxed-coralOP•2y ago
Is request.uniqueKey = uuidv4() what you are referring to, or do I have to manually enqueue URLs to add their uniqueKey?

correct-apricot•2y ago
yep!