trymaple.ai as LLM

I am trying to plug in an OpenAI-endpoint-compatible LLM (Maple) via a local proxy. How do I configure this, including what I enter in the terminal?
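For context, this is roughly what I am attempting. A minimal sketch, assuming the local proxy exposes the standard OpenAI chat-completions API; the proxy URL, API key, and model name below are placeholders I made up, not Maple's actual values:

```python
# Sketch only: assumes the local proxy in front of Maple speaks the
# OpenAI-compatible /v1 API. URL, key, and model name are placeholders.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8080/v1",  # local proxy address (placeholder)
    api_key="sk-placeholder",             # whatever key the proxy expects
)

response = client.chat.completions.create(
    model="maple-model",  # placeholder model name
    messages=[{"role": "user", "content": "Hello from behind the local proxy"}],
)
print(response.choices[0].message.content)
```

What I don't know is how to point the tool itself at this proxy, i.e. which settings or terminal commands to use.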