[self-hosted] Is there a way to get more logs or to change the logging level?

Is there a way to get more information on failed scrape requests when running a self-hosted Firecrawl instance? I often get an error that is not very self-explanatory (the scraped URL is not broken!). This is the full error message:

```
Error: All scraping engines failed! -- Double check the URL to make sure it's not broken. If the issue persists, contact us at help@firecrawl.com.
    at scrapeURLLoop (/app/dist/src/scraper/scrapeURL/index.js:297:15)
    at process.processTicksAndRejections (node:internal/process/task_queues:105:5)
    at async scrapeURL (/app/dist/src/scraper/scrapeURL/index.js:347:24)
    at async runWebScraper (/app/dist/src/main/runWebScraper.js:67:24)
    at async startWebScraperPipeline (/app/dist/src/main/runWebScraper.js:11:12)
    at async processJob (/app/dist/src/services/queue-worker.js:923:26)
    at async processJobInternal (/app/dist/src/services/queue-worker.js:323:28)
```
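For context, a minimal sketch of how verbosity is typically raised in a self-hosted deployment: this assumes the Firecrawl API/worker containers read a `LOGGING_LEVEL` environment variable (as suggested by the repo's example env file; verify the exact variable name and accepted values against your checked-out version). It is a config fragment, not a confirmed fix:

```shell
# .env for the self-hosted stack (or under `environment:` in docker-compose.yaml).
# Assumption: the API and worker services read LOGGING_LEVEL; a more verbose
# level (e.g. DEBUG) may surface which individual engine failed and why
# behind the generic "All scraping engines failed!" message.
LOGGING_LEVEL=DEBUG
```

After restarting the stack, the container logs (e.g. `docker compose logs -f` for the relevant services) should then include the per-engine attempts leading up to the aggregated error.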
