Use OpenClaw With OLLM
How to configure OpenClaw to use OLLM as a model provider for secure, verifiable agent interactions.
This guide explains how to configure OpenClaw to use OLLM as its model provider.
By configuring OLLM as your backend, all OpenClaw agent interactions will be routed through OLLM’s OpenAI-compatible API endpoint.
What is OpenClaw?
OpenClaw is an open-source AI agent platform that enables conversational AI across messaging platforms such as Telegram, Discord, Slack, Signal, iMessage, and WhatsApp.
It supports multiple LLM providers through a provider configuration system. OLLM can be configured as a custom OpenAI-compatible provider.
Setup
OLLM is not a built-in provider, but because it exposes an OpenAI-compatible API, you can add it manually as a custom provider in your openclaw.json configuration.
Step 1: Get Your OLLM API Key
- Log in to your OLLM dashboard.
- Navigate to the API Keys section.
- Generate a new API key.
- Copy the key securely.
Step 2: Configure OLLM in openclaw.json
Edit your ~/.openclaw/openclaw.json file and add OLLM as a provider.
```json
{
  "env": {
    "OLLM_API_KEY": "your-api-key"
  },
  "models": {
    "mode": "merge",
    "providers": {
      "ollm": {
        "baseUrl": "https://api.ollm.com/v1",
        "apiKey": "${OLLM_API_KEY}",
        "api": "openai-completions",
        "models": [
          { "id": "near/GLM-4.6", "name": "GLM 4.6" },
          { "id": "near/GLM-4.7", "name": "GLM 4.7" }
        ]
      }
    }
  },
  "agents": {
    "defaults": {
      "model": {
        "primary": "ollm/near/GLM-4.6"
      },
      "models": {
        "ollm/near/GLM-4.6": {}
      }
    }
  }
}
```

Alternatively, set your API key as an environment variable:

```bash
export OLLM_API_KEY="your-api-key"
```

OpenClaw will now recognize the `ollm` provider and route model requests to `https://api.ollm.com/v1`.
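The `${OLLM_API_KEY}` value above is an environment-variable placeholder rather than a literal key. As a rough illustration of how such a placeholder resolves (the `interpolate_env` helper is hypothetical, not part of OpenClaw):

```python
import re

def interpolate_env(value, env):
    """Replace ${VAR} placeholders in a config value with entries from an env map."""
    return re.sub(r"\$\{(\w+)\}", lambda m: env.get(m.group(1), ""), value)

# e.g. interpolate_env("${OLLM_API_KEY}", {"OLLM_API_KEY": "sk-test"}) -> "sk-test"
```

Unresolved placeholders collapse to an empty string in this sketch; a real loader would more likely raise an error so a missing key fails loudly.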
Step 3: Choose Your Model
Update the primary model under agents.defaults.model.
For example:
```json
"model": {
  "primary": "ollm/near/GLM-4.7"
}
```

Ensure that the selected model ID matches one available in your OLLM account.
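A typo in the primary model reference is easy to make, and this check can be scripted. The sketch below (a hypothetical helper, not an OpenClaw command) verifies that the configured primary model is actually declared under its provider:

```python
def validate_primary(config):
    """Check that agents.defaults.model.primary names a declared provider model."""
    primary = config["agents"]["defaults"]["model"]["primary"]
    # A reference has the form "<provider>/<model-id>"; split on the first slash only.
    provider, model_id = primary.split("/", 1)
    declared = {m["id"] for m in config["models"]["providers"][provider]["models"]}
    return model_id in declared
```

Run against the parsed openclaw.json, this returns False when the primary reference points at a model missing from `models.providers.<provider>.models`.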
Step 4: Restart OpenClaw
After updating your configuration, restart OpenClaw:
```bash
openclaw gateway restart
```
Your agents will now send requests to OLLM using the selected model.
Model Format
OpenClaw uses the format:

```
ollm/<model-id>
```

Example:

```
ollm/near/GLM-4.6
```

Any model exposed by OLLM can be added to the `models.providers.ollm.models` array and referenced using this format.
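Note that only the first slash separates the provider from the model ID; the model ID itself may contain further slashes, as in `near/GLM-4.6`. A minimal sketch of that parsing rule (the `parse_model_ref` name is illustrative, not an OpenClaw API):

```python
def parse_model_ref(ref):
    """Split "ollm/near/GLM-4.6" into provider "ollm" and model ID "near/GLM-4.6"."""
    provider, model_id = ref.split("/", 1)  # split on the FIRST slash only
    return provider, model_id
```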
Multiple Models with Fallbacks
OpenClaw supports fallback models if the primary model becomes unavailable.
You can configure this as follows:
```json
{
  "agents": {
    "defaults": {
      "model": {
        "primary": "ollm/near/GLM-4.6",
        "fallbacks": [
          "ollm/near/GLM-4.7"
        ]
      },
      "models": {
        "ollm/near/GLM-4.6": {},
        "ollm/near/GLM-4.7": {}
      }
    }
  }
}
```

If the primary model fails, OpenClaw will attempt the fallbacks in order.
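The fallback behavior amounts to a try-in-order loop: attempt the primary, and on failure move down the fallback list until a call succeeds. A conceptual sketch (an illustration of the idea, not OpenClaw's actual implementation):

```python
def call_with_fallbacks(primary, fallbacks, send):
    """Try the primary model first, then each fallback in order.

    `send` is any callable that takes a model reference and either returns a
    response or raises on failure.
    """
    last_error = None
    for model in [primary] + fallbacks:
        try:
            return send(model)
        except Exception as exc:
            last_error = exc  # remember the failure and try the next model
    raise last_error  # every model failed; surface the last error
```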
Using Auth Profiles (Recommended)
Instead of storing API keys directly in openclaw.json, you can use OpenClaw’s auth profiles.
Add the following:
```json
{
  "auth": {
    "profiles": {
      "ollm:default": {
        "provider": "ollm",
        "mode": "api_key"
      }
    }
  }
}
```

Then store your key securely:

```bash
openclaw auth set ollm:default --key "$OLLM_API_KEY"
```

Update your provider config to reference the profile:

```json
"providers": {
  "ollm": {
    "apiKey": "auth:ollm:default"
  }
}
```

This keeps your API key out of plaintext configuration files.
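With auth profiles in play, the `apiKey` field can take several shapes: an `auth:` profile reference, a `${VAR}` environment placeholder, or a literal key. A hedged sketch of how such a field might resolve (the `resolve_api_key` helper and its in-memory profile store are hypothetical):

```python
def resolve_api_key(field, env, profile_store):
    """Resolve an apiKey config value to an actual secret.

    - "auth:<profile-name>" -> secret from the (hypothetical) profile store
    - "${VAR}"              -> value from the environment map
    - anything else         -> treated as a literal key
    """
    if field.startswith("auth:"):
        return profile_store[field[len("auth:"):]]
    if field.startswith("${") and field.endswith("}"):
        return env[field[2:-1]]
    return field
```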
Monitoring Usage
You can track your OLLM usage in the OLLM dashboard, including:
- Total requests
- Token usage
- Model usage
- Verification status
Common Errors
No API key found for provider 'ollm'
OpenClaw cannot find your OLLM API key.
Fix:
- Verify the environment variable is set: `echo $OLLM_API_KEY`
- Confirm `${OLLM_API_KEY}` is referenced correctly in `openclaw.json`
- Ensure your auth profile is configured properly
Authentication Errors (401 / 403)
If you encounter authentication failures:
- Confirm your API key is valid
- Verify the base URL is `https://api.ollm.com/v1`
- Ensure your key has not been revoked
Model Not Found
If a model fails to load:
- Confirm the model ID is correct
- Ensure it is defined under `models.providers.ollm.models`
- Reference it using the `ollm/<model-id>` format
Advanced Configuration
Per-Channel Models
You can configure different models per messaging channel:
```json
{
  "telegram": {
    "agents": {
      "defaults": {
        "model": {
          "primary": "ollm/near/GLM-4.6"
        }
      }
    }
  },
  "discord": {
    "agents": {
      "defaults": {
        "model": {
          "primary": "ollm/near/GLM-4.7"
        }
      }
    }
  }
}
```

Resources
- OpenClaw Documentation
- OLLM Documentation
- OLLM Dashboard