Configure Aider with a three-tier model strategy: strong reasoning models for complex changes, fast edit models for quick fixes, and lightweight weak models for commit messages. Supports both OpenRouter (free models) and local Ollama deployment, with git integration and custom settings for Docker builds.
This skill sets up an `.aider.conf.yaml` file that configures Aider through the following steps:
1. **Verify Aider is installed**:
- Check if Aider CLI is available: `aider --version`
- If not installed, install via pip: `pip install aider-chat`
2. **Create the configuration file**:
- Create `.aider.conf.yaml` in your project root
- Copy the multi-model configuration template (see below)
3. **Set up API keys**:
- For OpenRouter models: `export OPENROUTER_API_KEY=your_key_here`
- Get a free API key from https://openrouter.ai/keys
- For local Ollama (optional): `export OLLAMA_API_BASE=http://127.0.0.1:11434`
4. **Choose your model tier**:
- **Free OpenRouter models** (recommended):
- Strong: `openrouter/tngtech/deepseek-r1t2-chimera:free`
- Edit: `openrouter/tngtech/deepseek-r1t2-chimera:free`
- Weak: `openrouter/mistralai/devstral-small-2505:free`
- **Local Ollama models** (for offline use):
- Uncomment the `model: ollama_chat/...` lines
- Start Ollama: `ollama serve`
- Point Aider at the local server: `export OLLAMA_API_BASE=http://127.0.0.1:11434` (as in step 3)
5. **Configure git integration**:
- Git integration is enabled by default (`git: true`)
- Auto-commits are **disabled** by default (`auto-commits: false`)
- Create an `.aiderignore` file to exclude sensitive files
6. **Alternative model options**:
- **Strong models**: DeepSeek R1 (various versions), OpenRouter Cypher Alpha
- **Edit models**: DeepSeek R1 Qwen3-8B
- **Weak models**: Mistral 7B Instruct, DeepSeek Qwen3-8B
- See commented options in the configuration file
7. **Run Aider with the configuration**:
- Start Aider: `aider`
- Aider will automatically load `.aider.conf.yaml`
- Verify the model configuration from Aider's startup messages, which list the models in use
8. **Configuration template**:
```yaml
openai-api-base: https://openrouter.ai/api/v1
model: openrouter/tngtech/deepseek-r1t2-chimera:free
editor-model: openrouter/tngtech/deepseek-r1t2-chimera:free
weak-model: openrouter/mistralai/devstral-small-2505:free
git: true
gitignore: false
aiderignore: .aiderignore
auto-commits: false
```
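Before launching, the checks from steps 1–3 can be bundled into a small pre-flight script. This is a minimal sketch; `preflight` is a hypothetical helper written for this guide, not an Aider command:

```shell
#!/bin/sh
# Pre-flight check before launching Aider: verifies the CLI is on PATH,
# the config file exists, and the API key variable is set.
# `preflight` is a hypothetical helper, not part of Aider.
preflight() {
  bin=$1; conf=$2; keyvar=$3
  command -v "$bin" >/dev/null 2>&1 || { echo "missing binary: $bin" >&2; return 1; }
  [ -f "$conf" ] || { echo "missing config: $conf" >&2; return 1; }
  eval "val=\${$keyvar:-}"
  [ -n "$val" ] || { echo "$keyvar is not set" >&2; return 1; }
  echo "ok"
}

# Typical call for the OpenRouter setup described above:
# preflight aider .aider.conf.yaml OPENROUTER_API_KEY && aider
```

Running the check separately makes failures (a missing key, a config file in the wrong directory) obvious before Aider starts and prints its own errors.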
**For a Docker build project**:
```bash
export OPENROUTER_API_KEY=sk-or-v1-...
cat > .aider.conf.yaml <<EOF
openai-api-base: https://openrouter.ai/api/v1
model: openrouter/tngtech/deepseek-r1t2-chimera:free
editor-model: openrouter/tngtech/deepseek-r1t2-chimera:free
weak-model: openrouter/mistralai/devstral-small-2505:free
git: true
auto-commits: false
aiderignore: .aiderignore
EOF
printf '%s\n' '*.env' '.secrets/' '*.key' > .aiderignore
aider
```
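A quick sanity check on the generated config can catch typos in the heredoc before Aider runs. `require_keys` is a hypothetical helper sketched here (a simple `grep` over top-level keys), not an Aider feature:

```shell
#!/bin/sh
# Confirm the generated .aider.conf.yaml defines each expected top-level
# key. `require_keys` is a hypothetical helper, not an Aider command.
require_keys() {
  file=$1; shift
  for key in "$@"; do
    grep -q "^$key:" "$file" || { echo "missing key: $key" >&2; return 1; }
  done
  echo "ok"
}

# Usage against the file written above:
# require_keys .aider.conf.yaml model weak-model git auto-commits
```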
**For local Ollama development**:
```bash
export OLLAMA_API_BASE=http://127.0.0.1:11434
ollama serve &
sleep 2  # give the server a moment to start before pulling models
ollama pull qwen2.5:7b
ollama pull gemma3:4b
aider --model ollama_chat/qwen2.5:7b --weak-model ollama_chat/gemma3:4b
```
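Since `ollama serve &` returns immediately, it can help to wait for the server to answer before starting Aider. A sketch, assuming the default port from `OLLAMA_API_BASE` above; `wait_for_ollama` is a hypothetical helper:

```shell
#!/bin/sh
# Poll the Ollama endpoint until it responds, then give up after a
# bounded number of attempts. Hypothetical helper, not part of Ollama.
wait_for_ollama() {
  url=${1:-http://127.0.0.1:11434}
  tries=${2:-10}
  i=0
  while [ "$i" -lt "$tries" ]; do
    if curl -sf "$url" >/dev/null 2>&1; then
      return 0
    fi
    i=$((i + 1))
    if [ "$i" -lt "$tries" ]; then sleep 1; fi
  done
  return 1
}

# Usage in the workflow above:
# ollama serve &
# wait_for_ollama "$OLLAMA_API_BASE" && aider
```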