Troubleshoot
Common issues and solutions when configuring OpenClaw to use OLLM as a model provider.
If your agents are not responding correctly, failing to start, or returning errors, work through the sections below to isolate the issue.
Configuration Issues
OpenClaw Does Not Recognize the ollm Provider
If OpenClaw fails to start or logs an error indicating the provider is unknown, verify that:
- The `providers.ollm` block exists in `openclaw.json`
- The JSON structure is valid
- There are no trailing commas or formatting errors
Validate your configuration file:
```
cat ~/.openclaw/openclaw.json | jq .
```
If this command fails, your JSON file contains syntax errors.
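If `jq` is not installed, Python's standard-library `json.tool` module performs the same validation. A minimal sketch (the sample file path is illustrative; point it at your real `~/.openclaw/openclaw.json`):

```shell
# Write a sample config and validate it; json.tool exits non-zero on bad JSON.
cat > /tmp/openclaw-sample.json <<'EOF'
{"providers": {"ollm": {"baseUrl": "https://api.ollm.com/v1"}}}
EOF
python3 -m json.tool /tmp/openclaw-sample.json > /dev/null \
  && echo "valid JSON" || echo "invalid JSON"
```

A trailing comma, a missing quote, or a stray brace will flip the output to `invalid JSON`.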
Environment Variable Not Detected
If OpenClaw reports:
```
No API key found for provider 'ollm'
```
this means OpenClaw cannot resolve `${OLLM_API_KEY}`.
Check your environment variable:
```
echo $OLLM_API_KEY
```
If nothing prints:
- Ensure you exported the variable
- Reload your shell
- Restart OpenClaw
Alternatively, temporarily hardcode the key (for testing only) to confirm the issue is environment-related.
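The check above can be scripted so it fails loudly without ever printing the secret itself. A small sketch:

```shell
# Report whether OLLM_API_KEY is visible to the current shell,
# without echoing the key's value.
if [ -z "${OLLM_API_KEY:-}" ]; then
  echo "OLLM_API_KEY is not set in this shell" >&2
else
  echo "OLLM_API_KEY is set (${#OLLM_API_KEY} characters)"
fi
```

A common cause of this failure is exporting the variable in one terminal session and launching OpenClaw from another.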
Authentication Errors (401 / 403)
If your agents return authentication failures:
```
401 Unauthorized
403 Forbidden
```
This indicates that the request reached OLLM but was rejected.
Common causes:
- Invalid or revoked API key
- Incorrect base URL
- Misconfigured auth profile
Verify:
- `baseUrl` is exactly `https://api.ollm.com/v1`
- The API key is valid in your OLLM dashboard
- The key has not been rotated or deleted
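To confirm where the rejection happens, you can probe the API directly with `curl`. This sketch assumes OLLM exposes an OpenAI-style `GET /v1/models` endpoint, which this guide does not confirm; substitute any documented OLLM endpoint:

```shell
# Send an authenticated request and map the HTTP status to a diagnosis.
status=$(curl -s -o /dev/null -w '%{http_code}' \
  -H "Authorization: Bearer $OLLM_API_KEY" \
  https://api.ollm.com/v1/models)
case "$status" in
  200) echo "key accepted" ;;
  401) echo "key invalid or missing" ;;
  403) echo "key recognized but not authorized" ;;
  *)   echo "unexpected status: $status" ;;
esac
```

A `401` here points at the key itself; a `403` points at permissions or account state rather than the key's spelling.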
If using auth profiles:
```
openclaw auth list
```
Ensure your profile exists and is correctly referenced:
```
"apiKey": "auth:ollm:default"
```
Model Not Found
If OpenClaw returns an error indicating that a model is unavailable, check the following:
- The model ID is spelled correctly (e.g., `near/GLM-4.6`)
- The model is defined in `models.providers.ollm.models`
- The agent references the model using `ollm/<model-id>`, for example:
```
ollm/near/GLM-4.6
```
If the model exists in OLLM but not in your `openclaw.json`, OpenClaw will not expose it.
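Putting those requirements together, the relevant parts of `openclaw.json` might look like the sketch below. The exact schema is an assumption based on the paths named in this guide (`models.providers.ollm.models`); check the OpenClaw reference for the authoritative shape:

```json
"models": {
  "providers": {
    "ollm": {
      "baseUrl": "https://api.ollm.com/v1",
      "apiKey": "${OLLM_API_KEY}",
      "models": [
        { "id": "near/GLM-4.6" }
      ]
    }
  }
}
```

Note that the provider block defines the model as `near/GLM-4.6`, while agents reference it with the provider prefix: `ollm/near/GLM-4.6`.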
Agent Starts but Does Not Respond
If OpenClaw starts successfully but messages receive no response:
- Check OpenClaw logs for API errors
- Confirm the selected model is valid
- Ensure the agent has a defined `primary` model
Example:
```
"agents": {
  "defaults": {
    "model": {
      "primary": "ollm/near/GLM-4.6"
    }
  }
}
```
If `primary` is missing or misconfigured, the agent will not know which model to call.
Fallback Models Not Working
If you configured fallbacks but they are not triggered:
- Confirm fallback models are listed under `models`
- Check that the primary model is actually failing
Fallback configuration example:
```
"model": {
  "primary": "ollm/near/GLM-4.6",
  "fallbacks": ["ollm/near/GLM-4.7"]
}
```
If the primary model succeeds, fallback will not execute.
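The fallback semantics can be illustrated with a generic sketch. The `try_model` stub here is hypothetical and stands in for a real model call; the point is that each fallback runs only when everything before it has failed:

```shell
# Try the primary first; a fallback runs only when the call before it fails.
try_model() { [ "$1" = "ollm/near/GLM-4.7" ]; }  # stub: only GLM-4.7 "succeeds"

for model in "ollm/near/GLM-4.6" "ollm/near/GLM-4.7"; do
  if try_model "$model"; then
    echo "served by $model"
    break
  fi
done
# → served by ollm/near/GLM-4.7
```

This is why a working primary makes fallbacks look "broken": the loop never reaches them.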
Gateway Restart Issues
If changes to openclaw.json do not take effect:
Restart the gateway:
```
openclaw gateway restart
```
If issues persist, stop and start the entire OpenClaw service.
Incorrect Base URL
If requests fail unexpectedly, verify the base URL is correct.
It must be:
```
https://api.ollm.com/v1
```
Do not:
- Add trailing slashes beyond `/v1`
- Append `/chat/completions` manually (OpenClaw handles endpoints)
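If you suspect a trailing slash crept into the value, POSIX parameter expansion can normalize it; a small sketch:

```shell
# Strip a single trailing slash so the value matches the expected base URL.
base="https://api.ollm.com/v1/"
base="${base%/}"
echo "$base"   # https://api.ollm.com/v1
```

Comparing the normalized value against `https://api.ollm.com/v1` catches the most common copy-paste mistake.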
Rate Limit or Usage Issues
If agents begin failing intermittently:
- Check OLLM dashboard usage
- Confirm token consumption
- Ensure your account has sufficient quota
You can monitor:
- Total requests
- Token usage
- Model usage
- Verification status
Auth Profile Not Working
If using auth profiles and authentication fails:
- Confirm the profile exists:
```
openclaw auth list
```
- Re-set the key:
```
openclaw auth set ollm:default --key "$OLLM_API_KEY"
```
- Ensure the provider config references:
```
"apiKey": "auth:ollm:default"
```
If the reference is incorrect, OpenClaw will not resolve the key.
Per-Channel Model Misconfiguration
If a specific channel (e.g., Telegram or Discord) fails:
- Ensure the model is defined inside that channel’s configuration
- Confirm there is no override conflict
Example:
```
"telegram": {
  "agents": {
    "defaults": {
      "model": {
        "primary": "ollm/near/GLM-4.6"
      }
    }
  }
}
```
If the global default is overridden incorrectly, behavior may differ per channel.
Debugging Checklist
Before escalating an issue, verify:
- `openclaw.json` is valid JSON
- `baseUrl` is correct
- `OLLM_API_KEY` is set or an auth profile is configured
- The model ID is defined and referenced correctly
- The gateway has been restarted
- OLLM dashboard shows incoming requests
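The first three checklist items can be automated with a short preflight script. This is a sketch, not an official OpenClaw tool; the default config path and the `grep` heuristic for `baseUrl` are assumptions:

```shell
# Quick preflight: validate the JSON, look for the base URL, check the key.
CFG="${CFG:-$HOME/.openclaw/openclaw.json}"

python3 -m json.tool "$CFG" > /dev/null 2>&1 \
  || echo "FAIL: $CFG is missing or not valid JSON"

grep -q 'https://api.ollm.com/v1' "$CFG" 2>/dev/null \
  || echo "FAIL: expected baseUrl not found in $CFG"

[ -n "${OLLM_API_KEY:-}" ] \
  || echo "WARN: OLLM_API_KEY not set (ignore if you use an auth profile)"
```

Silence means all three checks passed; each line of output names the checklist item that failed.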
This sequence isolates whether the issue is:
- Local configuration
- Authentication
- Model selection
- Service-level failure