Self-host: unable to scrape/crawl, "Unauthorized" error
Hi all, I'm trying to self-host on an Ubuntu system. Every time I run a crawl or scrape cURL request, I get {"error":"Unauthorized"}, regardless of which URL I'm trying to crawl/scrape.
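For reference, the request I'm running looks roughly like this (the endpoint path follows the docs as I understand them; the API key and target URL are placeholders):

```shell
# Scrape request against my local instance (placeholder key/URL)
curl -X POST http://localhost:3002/v1/scrape \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer fc-test-key" \
  -d '{"url": "https://example.com"}'
# Every variation returns {"error":"Unauthorized"}
```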
My .env file is basically a copy/paste of the example, except that I've set USE_DB_AUTHENTICATION=false, since I'm not using Supabase.
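Here's the relevant portion of that file (variable names follow the example env file; values are placeholders, and I'm assuming these are the settings that matter for auth):

```shell
# .env (excerpt) - placeholder values
USE_DB_AUTHENTICATION=false
PORT=3002
REDIS_URL=redis://localhost:6379
```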
I've also opened ports 6379 and 3002 in my firewall, so I don't see why there would be any permission issues.
Does anybody have suggestions on what I should try next?
ccing @rafaelmiller here