A multi-headed AI coding assistant with intelligent model routing, MCP server integration, persistent conversation memory, and cross-platform access via CLI and Telegram bot. Supports OpenAI, Anthropic, Groq, Together, Ollama, and more.
HYDRACODE is a comprehensive AI coding assistant that automatically routes tasks to appropriate models based on complexity, connects to Model Context Protocol servers for extended capabilities, maintains persistent memory across sessions, and offers both command-line and Telegram bot interfaces. It supports multiple LLM providers and includes a rich tool system for file operations, bash commands, and web search.
1. **Install HYDRACODE globally**
```bash
npm install -g hydracode-cli
```
2. **Run initial setup**
```bash
hydracode
```
The setup wizard will guide you through:
- Selecting your LLM provider
- Entering your API key
- Choosing a default model
3. **Alternative: Quick start with environment variable**
```bash
OPENAI_API_KEY=sk-xxx hydracode
```
Configure your preferred AI provider:
```bash
OPENAI_API_KEY=sk-xxx hydracode        # OpenAI
GROQ_API_KEY=gsk_xxx hydracode         # Groq
ANTHROPIC_API_KEY=sk-ant-xxx hydracode # Anthropic

# Or persist the key and provider in your config:
hydracode config --api-key YOUR_KEY --provider groq
```
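If more than one key is set, the assistant presumably picks a provider by some precedence. The sketch below shows one plausible detection scheme; the order (and the function name `detect_provider`) are assumptions for illustration, not hydracode's documented behavior:

```bash
# Hypothetical sketch of picking a provider from environment variables.
# The precedence order here is an assumption, not documented behavior.
detect_provider() {
  if   [ -n "${OPENAI_API_KEY:-}" ];    then echo "openai"
  elif [ -n "${ANTHROPIC_API_KEY:-}" ]; then echo "anthropic"
  elif [ -n "${GROQ_API_KEY:-}" ];      then echo "groq"
  else echo "none"
  fi
}
```

When in doubt, `hydracode config --provider ...` makes the choice explicit rather than relying on detection.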
Enable intelligent task routing to different models:
```bash
/routermode
```
The router classifies each task by complexity and sends it to a correspondingly capable model; for example, a HIGH-complexity task such as building a full application is routed to your strongest model.
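As an illustration only (hydracode's actual classifier is not documented here), complexity routing amounts to mapping a task description to a tier, then a tier to a model. A purely hypothetical keyword heuristic:

```bash
# Purely hypothetical heuristic; hydracode's real router is an internal
# implementation detail and is assumed, not shown, here.
classify_task() {
  case "$1" in
    *full-stack*|*architecture*|*database*) echo "HIGH" ;;
    *refactor*|*test*|*debug*)              echo "MEDIUM" ;;
    *)                                      echo "LOW" ;;
  esac
}
```

A HIGH result would then select something like `claude-3-opus` or `o1-preview`, while LOW tasks stay on a cheap `mini`-class model.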
Enable conversation memory across sessions:
```bash
/memory on # Enable persistent memory
/memory # Check memory status
/memory off # Disable memory
```
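The storage format behind `/memory` is not specified here. As a hypothetical sketch, persistent memory can be as simple as an append-only JSON-lines file per user; the file path and schema below are assumptions:

```bash
# Hypothetical storage sketch; hydracode's actual memory format and
# location are assumptions, not documented behavior.
MEMORY_FILE="${MEMORY_FILE:-$HOME/.hydracode/memory.jsonl}"

remember() {  # remember ROLE TEXT -- append one entry
  mkdir -p "$(dirname "$MEMORY_FILE")"
  printf '{"role":"%s","text":"%s"}\n' "$1" "$2" >> "$MEMORY_FILE"
}

recall() {    # recall [N] -- print the last N entries (default 5)
  tail -n "${1:-5}" "$MEMORY_FILE"
}
```

The point of a design like this is that memory survives process restarts, which is what lets a new session pick up where the last one left off.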
Set custom behavior instructions:
```bash
/bio
```
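The practical effect of a bio is, presumably, that your instructions act as a standing preamble to every request. The mechanism inside hydracode is an assumption; conceptually it is no more than:

```bash
# Hypothetical illustration of how a bio might be combined with each
# prompt; the real mechanism inside hydracode is an assumption here.
BIO="You are a Python expert focused on data science."

build_prompt() {  # build_prompt USER_TEXT
  printf '%s\n\n%s\n' "$BIO" "$1"
}
```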
1. **List available presets**
```bash
/mcp presets
```
2. **Add a preset server**
```bash
/mcp add brave-search # Web search
/mcp add github # GitHub API
/mcp add filesystem # File operations
/mcp add puppeteer # Browser automation
/mcp add postgres # PostgreSQL access
```
3. **Configure API keys for servers**
```bash
/mcp env brave-search BRAVE_API_KEY=xxx
/mcp env github GITHUB_TOKEN=ghp_xxx
/mcp env postgres POSTGRES_CONNECTION_STRING=postgresql://...
```
4. **Connect to servers**
```bash
/mcp connect
```
5. **List available tools**
```bash
/mcp tools
```
You can also register custom MCP servers by specifying the command used to launch them:
```bash
/mcp add myserver python /path/to/server.py
/mcp add custom-node node /path/to/server.js
```
| Preset | Description | Required API Key |
|--------|-------------|------------------|
| `brave-search` | Web search capabilities | BRAVE_API_KEY |
| `github` | GitHub API integration | GITHUB_TOKEN |
| `filesystem` | Local file access | None |
| `puppeteer` | Browser automation | None |
| `fetch` | HTTP requests | None |
| `sqlite` | SQLite database | None |
| `postgres` | PostgreSQL database | POSTGRES_CONNECTION_STRING |
| `slack` | Slack workspace integration | SLACK_BOT_TOKEN |
Access your AI assistant from anywhere via Telegram:
1. **Initialize bot setup**
```bash
# Inside HYDRACODE
/gateway setup
```
2. **Follow prompts to configure**
- Create a bot via BotFather on Telegram
- Enter your bot token
- Set authorized user IDs
3. **Start the bot server**
```bash
/serve
```
Or start directly:
```bash
hydracode serve
```
4. **Use from Telegram**
- Message your bot
- Use all HYDRACODE commands via Telegram
- Persistent sessions across devices
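Authorized user IDs matter because any Telegram user can message a public bot. Conceptually (the actual check inside hydracode is an assumption), each incoming message's sender ID is compared against your configured allowlist:

```bash
# Hypothetical allowlist check; the user IDs below are placeholders,
# and hydracode's real authorization logic is assumed, not shown.
AUTHORIZED_IDS="123456789 987654321"

is_authorized() {  # is_authorized TELEGRAM_USER_ID
  for id in $AUTHORIZED_IDS; do
    [ "$id" = "$1" ] && return 0
  done
  return 1
}
```

Messages from unauthorized IDs would simply be dropped, so keep the list limited to your own accounts.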
```bash
hydracode
> Create a Python web scraper that extracts headlines from news sites and saves them to CSV
```
```bash
hydracode
/mcp add brave-search
/mcp env brave-search BRAVE_API_KEY=your_key
/mcp connect
> Search for the latest developments in quantum computing and summarize the findings
```
```bash
/mcp add github
/mcp env github GITHUB_TOKEN=ghp_xxx
/mcp connect
> Show me the open issues in the repository "owner/repo" with the label "bug"
```
```bash
/routermode
> Build a full-stack todo app with React frontend, Node.js backend, and PostgreSQL database. Include authentication.
```
(The router automatically selects the HIGH-complexity model for this task.)
```bash
/gateway setup
/serve
```
| Provider | Models | Best For |
|----------|--------|----------|
| **OpenAI** | gpt-4o, gpt-4o-mini, o1-preview | General coding, reasoning |
| **Groq** | llama-3.3-70b, mixtral-8x7b | Fast responses, free tier |
| **Anthropic** | claude-3.5-sonnet, claude-3-opus | Code generation, analysis |
| **Together** | Llama, Mistral, Qwen | Open models, cost-effective |
| **Ollama** | Any local model | Privacy, offline use |
| **LM Studio** | Any local model | Local with GUI |
1. **Start with Groq** for fast, free experimentation
2. **Enable router mode** for cost-effective multi-task workflows
3. **Use memory** for ongoing projects to maintain context
4. **Set custom bio** for specialized domains (e.g., "You are a Python expert focused on data science")
5. **Connect MCP servers** only when needed to reduce overhead
6. **Use Telegram bot** for quick questions on the go
7. **Switch models** based on task: mini for simple, opus for complex