what model are you using from anthropic? i can check
You can use the `models` parameter to send up to 3 models, and OpenRouter will pick one based on availability. So load balancing is already managed by them.
That snippet uses the openai client and is just an example of how you could use it. I expect that however you normally connect to OpenRouter will work with that endpoint too.
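As a minimal sketch of the above (model names here are just illustrative, and the request shape assumes OpenRouter's standard chat completions endpoint), you can pass a `models` array of up to 3 fallback models in the request body, and OpenRouter routes to whichever is available:

```typescript
// Request body with up to 3 fallback models; OpenRouter picks one
// based on availability (see the Model Routing docs).
const body = {
  models: [
    "openai/gpt-4o",
    "anthropic/claude-3.5-sonnet",
    "mistralai/mistral-large",
  ],
  messages: [{ role: "user", content: "Hello" }],
};

// Hypothetical helper: any HTTP client that can POST JSON works here,
// not just the openai client.
async function send(apiKey: string): Promise<unknown> {
  const res = await fetch("https://openrouter.ai/api/v1/chat/completions", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${apiKey}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify(body),
  });
  return res.json();
}
```

The key point is that the fallback list lives in the request body itself, so no client-side load-balancing logic is needed.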
1/30/25, 6:36 PM

2/6/25, 10:55 PM



// See "Model Routing" section: openrouter.ai/docs/model-routing
models?: string[]; // up to 3 model IDs; OpenRouter routes based on availability