LM Studio provider config causes "Cannot read properties of undefined (reading 'trim')"
Environment:
- Clawdbot 2026.1.15
- LM Studio 0.3.39 (Build 2)
- macOS, M4 Max
- Model: meta-llama-3.1-8b-instruct
The issue:
Adding this provider config causes the trim error on ALL requests (Claude included):
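For reference, a stripped-down version of the failing config (exact key names are approximate, not a literal copy of the file) is a custom lm-studio entry in models.json with a baseUrl but no apiKey:

```json
{
  "providers": {
    "lm-studio": {
      "baseUrl": "http://localhost:1234/v1",
      "models": [
        { "id": "meta-llama-3.1-8b-instruct" }
      ]
    }
  }
}
```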
Confirmed working:
- LM Studio responds correctly to direct curl requests to http://localhost:1234/v1/chat/completions (full request shown below this list)
- Removing the lm-studio provider entirely → Claude works fine
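For completeness, this is the kind of request LM Studio answers directly (standard OpenAI-compatible chat completions; the only assumption is that the model name matches what's loaded in LM Studio):

```bash
curl http://localhost:1234/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "meta-llama-3.1-8b-instruct",
    "messages": [{"role": "user", "content": "Say hi"}]
  }'
```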
Tried:
- Clearing ~/.clawdbot/agents/*/agent/models.json
- Adding "apiType": "openai"
- Daemon restarts
Seems like the provider config itself is breaking something during initialization, not just when the model is called. What's the correct way to configure LM Studio?
(note: Claude Code wrote this for me)
The error comes from the pi-coding-agent library (used by Clawdbot). When you define custom models in models.json, the library's ModelRegistry.validateConfig() function requires:

1. apiKey - Must be present when defining custom models (this is a hard requirement)
2. baseUrl - Required when defining custom models
3. api - Should specify the API type (e.g., openai-completions)

The "Cannot read properties of undefined (reading 'trim')" error likely occurs because:

- The library tries to process the incomplete config
- Without an apiKey, some internal validation or processing path fails when trying to call .trim() on an undefined value
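If you do want the custom-model route, the fix implied by the requirements above is to supply all three fields. A rough sketch (the nesting and key names are assumptions based on those requirements, not a verified Clawdbot schema; LM Studio's local server typically accepts any placeholder key):

```json
{
  "providers": {
    "lm-studio": {
      "baseUrl": "http://localhost:1234/v1",
      "apiKey": "lm-studio",
      "api": "openai-completions",
      "models": [
        { "id": "meta-llama-3.1-8b-instruct" }
      ]
    }
  }
}
```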
Alternative: Override-only config

If you just want to override the base URL for a built-in provider without defining custom models, use this simpler format:
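A hedged sketch of that simpler format (the key names here are assumptions; check the Clawdbot provider docs for the exact schema): only the base URL is set for the built-in provider, with no custom models block:

```json
{
  "providers": {
    "lm-studio": {
      "baseUrl": "http://localhost:1234/v1"
    }
  }
}
```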
