Aider configuration optimized for local Ollama with devstral-small-2 model, featuring security-focused file restrictions, automatic git commits, and context management for 32K token window.
This skill configures Aider to work with a local Ollama instance running the devstral-small-2:24b model, applying strict security controls and context management tuned to a 32K token window. The result is a production-ready Aider configuration.
When deploying this configuration:
1. **Create the configuration file** at `.aider.conf.yml` in the project root with the following content:
```yaml
model: ollama_chat/devstral-small-2:24b
# ollama_chat/ models talk to Ollama's native API, which defaults to
# http://localhost:11434; set the OLLAMA_API_BASE environment variable
# to override. The OpenAI-compatible openai-api-base setting does not
# apply to ollama_chat/ models.
map-tokens: 24000
max-chat-history-tokens: 8000
file:
- src/
- tests/
- scripts/
read:
- README.md
- docs/
- Cargo.toml
- Cargo.lock
- .env.example
git: true # Require Git repository
gitignore: true # Respect .gitignore
aiderignore: .aiderignore # Enforce .aiderignore
auto-commits: true
dirty-commits: true
auto-lint: true
pretty: true
dark-mode: true
stream: true
```
2. **Verify prerequisites**:
- Ollama is running locally on port 11434
- devstral-small-2:24b model is pulled (`ollama pull devstral-small-2:24b`)
- OLLAMA_CONTEXT_LENGTH environment variable is set to 32768
- Project is a Git repository
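The prerequisites above can be spot-checked from the shell. This is a sketch: it assumes Ollama's default API address and uses `/api/tags`, Ollama's model-listing endpoint.

```shell
# Check that Ollama is listening and the model is pulled
# (assumes the default API address 127.0.0.1:11434)
curl -sf http://127.0.0.1:11434/api/tags | grep -q 'devstral-small-2' \
  && echo "model available" \
  || echo "model missing: run 'ollama pull devstral-small-2:24b'"

# Confirm the project is a Git repository
git rev-parse --is-inside-work-tree >/dev/null 2>&1 \
  && echo "git repo OK" \
  || echo "not a git repository: run 'git init'"
```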
3. **Create .aiderignore file** if needed to exclude sensitive files:
```
.env
*.key
*.pem
secrets/
```
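One way to create that file from the shell (a sketch; extend the patterns to cover your project's secrets):

```shell
# Write the .aiderignore in the project root; files matching these
# patterns are excluded from being added to the chat
cat > .aiderignore <<'EOF'
.env
*.key
*.pem
secrets/
EOF
```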
4. **Verify directory structure** matches the `file:` whitelist in the configuration:
- Ensure `src/`, `tests/`, and `scripts/` directories exist
- Confirm read-only reference files are present
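A quick shell check for the expected layout (assumes the `file:` and `read:` lists from the configuration above):

```shell
# Report any whitelisted directory that is missing
for d in src tests scripts; do
  [ -d "$d" ] || echo "missing directory: $d"
done

# Confirm the read-only reference files are present
for f in README.md Cargo.toml Cargo.lock .env.example; do
  [ -f "$f" ] || echo "missing file: $f"
done
```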
5. **Test the configuration**:
- Run `aider` in the project directory
- Verify it connects to Ollama successfully
- Confirm file restrictions are enforced
```bash
# Set the context window before starting the server
export OLLAMA_CONTEXT_LENGTH=32768
# Start the Ollama server in the background (skip this if Ollama
# already runs as a system service)
ollama serve &
# Pull the model, then launch aider from the project root
ollama pull devstral-small-2:24b
cd /path/to/project
aider
```