OpenAI to vLLM

I want to switch you from using OpenAI to a local model hosted with vLLM. What changes do I need to make, and where?
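A minimal sketch of the typical change, assuming the application uses the official `openai` Python client and that vLLM's OpenAI-compatible server is running locally on its default port 8000; the model name and endpoint URL below are placeholders for whatever you actually serve.

```python
# Assumes vLLM's OpenAI-compatible server has already been started, e.g.:
#   vllm serve meta-llama/Llama-3.1-8B-Instruct
# (model name and port are placeholders; adjust to your deployment)

from openai import OpenAI

# Before: client = OpenAI()  # reads OPENAI_API_KEY and talks to api.openai.com
# After: point the same client at the local vLLM endpoint instead.
client = OpenAI(
    base_url="http://localhost:8000/v1",  # vLLM's OpenAI-compatible endpoint
    api_key="not-needed",                 # vLLM ignores the key unless the server sets --api-key
)

response = client.chat.completions.create(
    model="meta-llama/Llama-3.1-8B-Instruct",  # must match the model vLLM is serving
    messages=[{"role": "user", "content": "Hello from vLLM!"}],
)
print(response.choices[0].message.content)
```

In most codebases the only edits are the `base_url` and the `model` name; request and response shapes can stay the same because vLLM mirrors the OpenAI Chat Completions API.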