llm_extraction feature

Hello! Is there a way to use a different model with the llm_extraction feature other than OpenAI? (I'm trying to use Groq.)
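For context, this is the general pattern being asked about: Groq exposes an OpenAI-compatible endpoint, so if the llm_extraction feature (or the code calling it) allows a custom base URL or client, the swap usually looks like the sketch below. The client setup and model name here are assumptions for illustration, not the library's documented configuration.

```python
# Minimal sketch, assuming an OpenAI-compatible client can be pointed at Groq.
# Parameter values (API key, model name) are placeholders, not confirmed settings
# of the llm_extraction feature itself.
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_GROQ_API_KEY",                 # Groq key instead of an OpenAI key
    base_url="https://api.groq.com/openai/v1",   # Groq's OpenAI-compatible endpoint
)

response = client.chat.completions.create(
    model="llama-3.1-8b-instant",  # example Groq-hosted model; check Groq's model list
    messages=[{"role": "user", "content": "Extract the page title from: ..."}],
)
print(response.choices[0].message.content)
```

Whether llm_extraction itself accepts a custom base URL or provider is the open question here; the sketch only shows what "using Groq instead of OpenAI" typically means in practice.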