Firecrawl · 16mo ago
jchow

Self-host: unable to scrape/crawl, "Unauthorized" error

Hi all, I'm trying to self-host on an Ubuntu system. Every time I run a crawl or scrape cURL request, I get {"error":"Unauthorized"}, regardless of what URL I'm trying to crawl/scrape. My .env is basically a copy/paste of the example, except I've set USE_DB_AUTHENTICATION=false, since I'm not using Supabase. I've also opened ports 6379 and 3002 in my firewall, so I can't see why there would be any permissions issues. Does anybody have suggestions on what I should try next?
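
For reference, my scrape request looks roughly like this (localhost on port 3002 with a placeholder token and URL; the exact endpoint version and token value are just examples):

```
# Roughly what I'm running (placeholder URL and token; the endpoint path
# may be /v0/scrape or /v1/scrape depending on the build being self-hosted).
curl -X POST http://localhost:3002/v1/scrape \
  -H 'Content-Type: application/json' \
  -H 'Authorization: Bearer fc-test' \
  -d '{"url": "https://example.com"}'
```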
1 Reply
Adobe.Flash · 15mo ago
ccing @rafaelmiller here