Firecrawl · 13mo ago
Kyo


Or is it possible to use the /crawl endpoint without actually scraping all the URLs, and instead return only a list of URLs that I can then pass to the /scrape endpoint?
2 Replies
Caleb · 13mo ago
Hey Kyo, you can do this with the returnOnlyUrls parameter on /crawl, but it isn't any cheaper or faster than getting all the content. Regarding the first question, I think adding the include/exclude paths makes a ton of sense on map. I'll add it as a feature request.
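For reference, a minimal sketch of a crawl request that only returns URLs. It assumes the REST endpoint at https://api.firecrawl.dev/v0/crawl and that returnOnlyUrls (plus include/exclude patterns) is passed under crawlerOptions; the exact request shape and example paths here are assumptions, so check the current docs before relying on them.

```python
# Sketch: ask /crawl for URLs only, skipping content extraction.
# Endpoint path, option nesting, and the include/exclude patterns are assumptions.
import os
import requests

API_KEY = os.environ["FIRECRAWL_API_KEY"]

resp = requests.post(
    "https://api.firecrawl.dev/v0/crawl",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "url": "https://example.com",
        "crawlerOptions": {
            "returnOnlyUrls": True,   # return discovered URLs, not scraped content
            "includes": ["blog/*"],   # hypothetical include pattern
            "excludes": ["admin/*"],  # hypothetical exclude pattern
        },
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json())  # typically a job id you then poll for the URL list
```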
Kyo (OP) · 13mo ago
Great! So in that sense, would it just function the same way as the crawl endpoint with returnOnlyUrls set to true? My use case/goal is to have my users input a URL plus includePaths/excludePaths and get back a list of URLs to render in my frontend. My users should then be able to trim that list so they end up with only the URLs they actually want scraped. After that, I want to iterate over the trimmed list, running a scrape job for each URL. Ideally the step that gathers these URLs would be as fast as possible. @Caleb, would this be possible with the map endpoint in this case? And also, would it be possible to disable the "search" string that can currently be passed to the /map endpoint?
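A rough sketch of the workflow described above: gather candidate URLs, let the user trim the list, then scrape each remaining URL. The endpoint paths, parameter names, and response fields here are assumptions rather than confirmed API details, and the helper functions are hypothetical; verify everything against the current Firecrawl docs.

```python
# Sketch of the map/trim/scrape flow. Endpoints, parameter names, and
# response shapes are assumptions -- verify against the Firecrawl docs.
import os
import requests

API_KEY = os.environ["FIRECRAWL_API_KEY"]
BASE = "https://api.firecrawl.dev"
HEADERS = {"Authorization": f"Bearer {API_KEY}"}

def get_candidate_urls(site: str, include_paths: list[str]) -> list[str]:
    """Hypothetical wrapper around /map (or /crawl with returnOnlyUrls)."""
    resp = requests.post(
        f"{BASE}/v1/map",
        headers=HEADERS,
        json={"url": site, "includePaths": include_paths},  # assumed parameter names
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json().get("links", [])  # assumed response field

def scrape(url: str) -> dict:
    """Hypothetical wrapper around /scrape for a single URL."""
    resp = requests.post(
        f"{BASE}/v1/scrape",
        headers=HEADERS,
        json={"url": url},
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()

candidates = get_candidate_urls("https://example.com", ["blog/*"])
# In the real app the user trims this list in the frontend; here we just slice it.
selected = candidates[:5]
results = [scrape(u) for u in selected]
```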
