This skill configures Aider to work with a self-hosted Ollama instance running Qwen2.5 32B, as part of a local LLM development setup on a Mac Studio.
Create a `.aider.conf.yml` file in your project root with the following settings. Aider expects flat keys rather than nested sections, and the Ollama server address is supplied separately through the `OLLAMA_API_BASE` environment variable:
```yaml
# Model to use; the "ollama_chat/" prefix routes requests through litellm to Ollama
model: ollama_chat/qwen2.5:32b

# API request timeout in seconds (large local models can be slow to respond)
timeout: 600

# Editor launched by the /editor command
editor: vim

# Use colors suited to a dark terminal
dark-mode: true

# Don't commit automatically after each change
auto-commits: false

# Stream responses as they arrive
stream: true

# Show diffs before applying changes
show-diffs: true

# File encoding for reads and writes
encoding: utf-8
```
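The Ollama endpoint itself is not set in the YAML file: Aider's litellm backend reads it from the `OLLAMA_API_BASE` environment variable. A minimal sketch (the hostname is an assumption — use `ollama` instead of `localhost` when Aider runs inside the same Docker network):

```shell
# Tell Aider (via litellm) where the Ollama server listens.
# Swap localhost for the docker network hostname "ollama" if applicable.
export OLLAMA_API_BASE=http://localhost:11434
```

Add the export to your shell profile if you want it set for every session.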
1. **Install Ollama**:
```bash
# For Mac
brew install ollama
```
2. **Pull Qwen2.5 Model**:
```bash
ollama pull qwen2.5:32b
```
3. **Start Ollama Service**:
```bash
# If using Docker Compose
docker-compose up -d ollama
# Or run locally
ollama serve
```
4. **Install Aider**:
```bash
pip install aider-chat
```
5. **Place Configuration**:
- Save the YAML configuration as `.aider.conf.yml` in your project root
- Aider will automatically detect and use it
6. **Verify Connection**:
```bash
export OLLAMA_API_BASE=http://localhost:11434
aider --model ollama_chat/qwen2.5:32b
```
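Because a 32B model can take a while to load, it helps to wait until the server answers before launching Aider. A small sketch (`/api/version` is part of Ollama's HTTP API; the function name and retry count here are arbitrary choices):

```shell
# Poll the Ollama API until it responds, or give up after N tries.
# Usage: ollama_ready [base_url] [tries] && aider --model ollama_chat/qwen2.5:32b
ollama_ready() {
  base="${1:-http://localhost:11434}"
  tries="${2:-30}"
  i=0
  while [ "$i" -lt "$tries" ]; do
    # -f makes curl fail on server errors; -sS keeps output quiet but shows errors
    if curl -fsS "$base/api/version" >/dev/null 2>&1; then
      return 0
    fi
    i=$((i + 1))
    sleep 1
  done
  return 1
}
```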
Typical ways to launch Aider:
```bash
aider                                  # use the model from .aider.conf.yml
aider --model ollama_chat/qwen2.5:32b  # override the model explicitly
aider src/main.py src/utils.py         # start with specific files in the chat
```
Useful in-chat commands:
```bash
/add path/to/file.py   # add a file to the chat
/diff                  # show the diff of pending changes
/commit                # commit the current changes
/clear                 # clear the chat history
```
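If you run Ollama with Docker Compose (so that Aider can reach it at `http://ollama:11434` on the compose network), a minimal service definition might look like this — a sketch, with the image tag, port mapping, and volume name as assumptions to adapt:

```yaml
services:
  ollama:
    image: ollama/ollama:latest
    ports:
      - "11434:11434"   # expose the Ollama API to the host
    volumes:
      - ollama-data:/root/.ollama   # persist pulled models across restarts
volumes:
  ollama-data:
```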
To use a different model, change the `model` key to any model your Ollama instance serves:
```yaml
model: ollama_chat/codellama:34b  # or any other Ollama-hosted model
```
To use a different editor:
```yaml
editor: "code --wait"  # VS Code; "nano", "emacs", etc. also work
```
To let Aider commit its changes automatically:
```yaml
auto-commits: true
```