Error: Supabase Client Not Configured in Self-Hosted Firecrawl
I'm currently self-hosting Firecrawl and running into an issue with configuring the Supabase client. I'm using the example code from the documentation:
```python
import uuid
from firecrawl.firecrawl import FirecrawlApp
```
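In case it helps, here is a minimal stdlib-only sketch of the request I believe the SDK sends when starting a crawl (the `/v1/crawl` path and the dummy API key are assumptions on my part; older SDK versions may use `/v0/crawl`):

```python
import json
from urllib import request

# My self-hosted instance; auth is bypassed, so the key is a placeholder.
API_URL = "http://localhost:3002"

# Build the crawl request the SDK would send (endpoint path is an assumption).
payload = json.dumps({"url": "https://example.com"}).encode()
req = request.Request(
    f"{API_URL}/v1/crawl",
    data=payload,
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer fc-dummy",  # placeholder key
    },
)
# request.urlopen(req) is what triggers the 500 error quoted below.
```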
Despite setting all the required environment variables (OpenAI API key, model name, and base URL), I get the following error:
```
HTTPError: Internal Server Error: Failed to start crawl job. Supabase client is not configured.
```
And the server logs show:
```
[2024-08-11T13:37:00.209Z] WARN - You're bypassing authentication
api_1 | [2024-08-11T13:37:00.210Z] ERROR - Error: Supabase client is not configured.
```
I've verified that the server URL is correct (http://localhost:3002/). It seems like there's an issue with the Supabase client configuration, but I'm not sure what exactly is missing or misconfigured.
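For reference, this is roughly what my `apps/api/.env` looks like (variable names follow the repo's example env file; values are redacted, and the model name is just an illustrative value):

```shell
# apps/api/.env (sketch; secrets redacted)
USE_DB_AUTHENTICATION=false   # Supabase deliberately left unconfigured
PORT=3002
REDIS_URL=redis://redis:6379
OPENAI_API_KEY=sk-...         # redacted
MODEL_NAME=gpt-4o             # illustrative value
# SUPABASE_URL / SUPABASE_SERVICE_TOKEN intentionally unset, per SELF_HOST.md
```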
I'm aware that the note in SELF_HOST.md states that Supabase is not configured in a self-hosted setup but that scraping should still work. In my case, however, crawling fails as well.
Could someone help me figure out how to properly configure the Supabase client for Firecrawl in a self-hosted environment? Any advice or pointers would be greatly appreciated!