Using LM Studio as a Provider

What's missing here so that LM Studio can be used as a backup?

Current: anthropic/claude-opus-4-5
Default: anthropic/claude-opus-4-5
Agent: main
Auth file: ~/.clawdbot/agents/main/agent/auth-profiles.json
(previous selection reset to default)

[anthropic] endpoint: default auth: anthropic:claude-cli=OAuth (next, lastGood, exp 365d) (auth-profiles.json: ~/.clawdbot/agents/main/agent/auth-profiles.json)
• anthropic/claude-opus-4-5 (opus)
• anthropic/claude-sonnet-4-5 (sonnet)

[google] endpoint: default auth: google:ics=AIzaSyBI...waGQw9yg (next), google:clawdbot=AIzaSyAT...iXLSGnAo (auth-profiles.json: ~/.clawdbot/agents/main/agent/auth-profiles.json)
• google/gemini-3-pro-preview (gemini)

[openai-codex] endpoint: default auth: openai-codex:oliver=OAuth (next, exp 10d), openai-codex:sylvia=OAuth (lastGood, exp 3d) (auth-profiles.json: ~/.clawdbot/agents/main/agent/auth-profiles.json)
• openai-codex/gpt-5.2 (gpt)

[lm-studio] endpoint: default auth: missing
• lm-studio/qwen3-4b-mlx (qwen3)


config:

 "models": {
    "mode": "merge",
    "providers": {
      "lmstudio": {
        "baseUrl": "http://192.168.1.142:1234/v1",
        "apiKey": "lm-studio",
        "api": "openai-completions",
        "models": [
          {
            "id": "qwen3-4b-mlx",
            "name": "qwen3-4b-mlx",
            "reasoning": false,
            "input": [
              "text"
            ],
            "cost": {
              "input": 0,
              "output": 0,
              "cacheRead": 0,
              "cacheWrite": 0
            },
            "contextWindow": 32768,
            "maxTokens": 8192
          }
        ]
      }
    }
  },
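
Since the provider is declared with "api": "openai-completions", one way to sanity-check the LM Studio side independently of the agent is to query its OpenAI-compatible endpoint directly. Below is a minimal sketch in Python, assuming the baseUrl and placeholder apiKey from the config above (LM Studio's local server generally does not validate the key):

import json
import urllib.request

BASE_URL = "http://192.168.1.142:1234/v1"  # baseUrl from the config above
API_KEY = "lm-studio"                      # placeholder apiKey from the config above

# Ask the LM Studio server which models it is currently serving.
req = urllib.request.Request(
    f"{BASE_URL}/models",
    headers={"Authorization": f"Bearer {API_KEY}"},
)
with urllib.request.urlopen(req, timeout=5) as resp:
    data = json.load(resp)

# Expect "qwen3-4b-mlx" in this list if the model is loaded in LM Studio.
print([m["id"] for m in data.get("data", [])])

If this request fails, or the returned id does not match the "id" in the models list, the agent will not be able to reach that model either.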