Configure Aider to use a local Ollama instance with the DeepSeek Coder model for completely private, offline AI pair programming, with a Solarized Dark color theme. This setup keeps all code and model interactions on your machine.
This configuration points Aider at your local Ollama server instead of a cloud API. Before configuring Aider, make sure Ollama is ready:
1. Install and run Ollama: `ollama serve`
2. Pull the DeepSeek Coder model: `ollama pull deepseek-coder:6.7b`
3. Verify Ollama is running on `http://localhost:11434`
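The check in step 3 can be scripted instead of done by hand. A minimal sketch in Python, assuming the default Ollama address (`/api/tags` is Ollama's endpoint for listing pulled models):

```python
import json
from urllib.request import urlopen

OLLAMA_BASE = "http://localhost:11434"  # default Ollama address


def pulled_models(tags_json: dict) -> list[str]:
    """Extract model names from Ollama's /api/tags response."""
    return [m["name"] for m in tags_json.get("models", [])]


def check_model(base: str, wanted: str) -> bool:
    """Return True if `wanted` is among the models the server has pulled."""
    with urlopen(f"{base}/api/tags") as resp:
        tags = json.load(resp)
    return wanted in pulled_models(tags)


# Example (requires a running Ollama server, so not executed here):
# if not check_model(OLLAMA_BASE, "deepseek-coder:6.7b"):
#     print("Model missing - run: ollama pull deepseek-coder:6.7b")
```

If the check fails, either `ollama serve` is not running or the model has not been pulled yet.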
1. **Create the Aider configuration file** (`.aider.conf.yml`) in your project root or home directory:
```yaml
model: ollama_chat/deepseek-coder:6.7b
# For ollama_chat/ models, Aider reads the server address from the
# OLLAMA_API_BASE environment variable (default http://localhost:11434);
# export it in your shell if your server runs elsewhere.
dark-mode: true
code-theme: solarized-dark
stream: true
pretty: true
```
2. **Verify Ollama is running**:
```bash
curl http://localhost:11434/api/tags
```
3. **Start Aider** in your project directory:
```bash
aider
```
Aider will automatically detect and use the `.aider.conf.yml` configuration.
Once configured, use Aider normally:
```bash
aider src/main.py    # start Aider with a file in the chat
```
Then, inside the Aider chat:
```
/add src/utils.py
"Refactor the authentication function to use async/await"
/diff
```
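Aider can also be driven non-interactively with its `--message` flag, which applies a single instruction to the named files and exits. A sketch of wrapping that in a script (the file names and prompt below are placeholders):

```python
import subprocess


def build_aider_cmd(files: list[str], instruction: str) -> list[str]:
    """Build a one-shot aider invocation: --message applies the
    instruction to the listed files and exits (no interactive chat)."""
    return ["aider", "--message", instruction, *files]


def run_aider(files: list[str], instruction: str) -> int:
    """Run the one-shot command and return aider's exit code."""
    return subprocess.run(build_aider_cmd(files, instruction)).returncode


# Example (requires aider installed, so not executed here):
# run_aider(["src/main.py", "src/utils.py"],
#           "Refactor the authentication function to use async/await")
```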
To use a different Ollama model, first pull it:
```bash
ollama pull codellama:13b
```
Then update `.aider.conf.yml`:
```yaml
model: ollama_chat/codellama:13b
```
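If you switch between models often, one low-friction approach is to keep the alternatives as comments in `.aider.conf.yml` and toggle the active line:

```yaml
model: ollama_chat/codellama:13b
# model: ollama_chat/deepseek-coder:6.7b
# model: ollama_chat/deepseek-coder:33b
```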
**Aider can't connect to Ollama:** Make sure `ollama serve` is running and that `curl http://localhost:11434/api/tags` returns JSON. If your server listens on a non-default address, set the `OLLAMA_API_BASE` environment variable to match.
**Model not found:** The model name in `.aider.conf.yml` must exactly match a model you have pulled. Run `ollama list` to see installed models and `ollama pull deepseek-coder:6.7b` to fetch a missing one.
**Slow responses:** Local inference speed depends on your hardware. Try a smaller model (e.g. `deepseek-coder:1.3b`) or confirm that Ollama is using your GPU rather than falling back to CPU.