Expert assistant for the Tel-Insights microservices architecture: a Telegram monitoring and AI analysis platform with async message-queue communication, covering development of the aggregator, AI analysis, Smart Analysis (MCP server), and alerting services.
Tel-Insights uses a **microservices architecture** with **RabbitMQ** for async communication:
```
Aggregator → RabbitMQ → AI Analysis → PostgreSQL
    ↓                                     ↓
PostgreSQL                         Smart Analysis
    ↑                                     ↓
Alerting ←───────────────────────── MCP Tools
```
1. **Aggregator** (`src/aggregator/`): Telegram client using Telethon, monitors channels, publishes to queue
2. **AI Analysis** (`src/ai_analysis/`): Queue consumer, processes messages with Gemini LLM, stores metadata
3. **Smart Analysis** (`src/smart_analysis/`): MCP server, frequency-based alerts, news summarization
4. **Alerting** (`src/alerting/`): Telegram bot using python-telegram-bot (TODO: not yet implemented)
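The services above communicate by exchanging JSON payloads over RabbitMQ. As a hedged sketch of that contract (the exact field names are illustrative assumptions, not taken from the actual aggregator code in `src/aggregator/`), the aggregator might publish an envelope like this, which the AI Analysis consumer then parses:

```python
import json
from datetime import datetime, timezone


def build_message_envelope(channel_id: int, message_id: int, text: str) -> str:
    """Serialize a raw Telegram message for publishing to RabbitMQ.

    Field names are illustrative assumptions; check src/aggregator/ for
    the real payload schema.
    """
    envelope = {
        "channel_id": channel_id,
        "message_id": message_id,
        "text": text,
        "collected_at": datetime.now(timezone.utc).isoformat(),
    }
    return json.dumps(envelope)


def parse_message_envelope(body: str) -> dict:
    """Deserialize a queue message on the AI Analysis consumer side."""
    return json.loads(body)
```

Keeping serialization in one pair of helpers like this makes the queue contract testable without a running broker.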
Always use the service runner for starting services:
```bash
python run_service.py aggregator # Telegram message collector
python run_service.py ai-analysis # AI processing with Gemini
python run_service.py smart-analysis # MCP server and alerts
python run_service.py alerting # Telegram bot interface
./run_service.sh <service> # Linux/macOS
run_service.bat <service> # Windows
```
Run tests using pytest with appropriate markers:
```bash
pytest                              # run the full test suite
pytest -m unit                      # unit tests only
pytest -m integration               # integration tests only
pytest -m e2e                       # end-to-end tests only
pytest --cov=src --cov-report=html  # with an HTML coverage report
pytest tests/unit/test_models.py    # a single test file
```
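The markers above select tests by type. A minimal sketch of what a marked unit test looks like (the helper function here is hypothetical, not part of the Tel-Insights codebase):

```python
# tests/unit/test_example.py -- illustrative only
import pytest


def normalize_topics(topics):
    """Lowercase, trim, and de-duplicate topic labels (hypothetical helper)."""
    seen = []
    for topic in topics:
        t = topic.strip().lower()
        if t and t not in seen:
            seen.append(t)
    return seen


@pytest.mark.unit
def test_normalize_topics_deduplicates():
    assert normalize_topics(["AI", "ai ", "Tech"]) == ["ai", "tech"]
```

`pytest -m unit` then picks this test up via its marker.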
**Test structure:** tests are organized under `tests/` by type (e.g. `tests/unit/`), matching the pytest markers above.
Use Alembic for database migrations:
```bash
alembic upgrade head                                         # apply all pending migrations
alembic revision --autogenerate -m "Description of changes"  # generate a migration from model changes
alembic downgrade -1                                         # roll back the most recent migration
```
Always format and lint code before committing:
```bash
black src/   # format (88-character line length)
isort src/   # sort imports (Black profile)
flake8 src/  # lint
mypy src/    # type-check
```
**Important:** Use these exact commands. Black is configured for 88-character line length and isort uses Black profile.
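Those settings typically live in `pyproject.toml`. A sketch of what the configuration described above might look like (the section contents are inferred from the stated 88-character limit and Black profile, not copied from the repository):

```toml
[tool.black]
line-length = 88

[tool.isort]
profile = "black"
```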
Messages store rich AI analysis in PostgreSQL JSONB (GIN indexed):
```json
{
"summary": "Brief message summary",
"topics": ["technology", "AI"],
"sentiment": "positive",
"entities": {"organizations": ["OpenAI"], "locations": ["SF"]},
"keywords": ["AI", "breakthrough"],
"confidence_score": 0.95
}
```
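Consumers of this metadata can defensively validate its shape before writing to the JSONB column. A minimal sketch, where the required keys are taken from the example above but the function itself is hypothetical:

```python
def validate_ai_metadata(metadata: dict) -> list[str]:
    """Return a list of problems with an AI-metadata dict; empty means valid.

    Checks only the structure shown in the documented example.
    """
    problems = []
    required = {
        "summary": str,
        "topics": list,
        "sentiment": str,
        "entities": dict,
        "keywords": list,
        "confidence_score": float,
    }
    for key, expected_type in required.items():
        if key not in metadata:
            problems.append(f"missing key: {key}")
        elif not isinstance(metadata[key], expected_type):
            problems.append(f"{key} should be {expected_type.__name__}")
    score = metadata.get("confidence_score")
    if isinstance(score, float) and not 0.0 <= score <= 1.0:
        problems.append("confidence_score must be between 0.0 and 1.0")
    return problems
```

Running such a check in the AI Analysis service before the database write keeps the JSONB column uniform, which the GIN index and downstream queries rely on.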
Smart Analysis runs an MCP server on port 8003; its tools are registered in `src/smart_analysis/mcp_server.py` via FastMCP decorators.
Copy `config.env.template` to `.env` and fill in the environment variables for your setup (for example, `MONITORED_CHANNELS`).
The core tables are defined in `src/shared/models.py`; schema changes go through Alembic migrations.
**Development guidelines:**
1. **Before making changes**: Read relevant service code in `src/{service_name}/`
2. **Service modifications**: Understand the async message flow through RabbitMQ
3. **Database changes**: Create Alembic migration with `--autogenerate`
4. **Adding features**: Test with unit tests first, then integration tests
5. **Code style**: Always run `black` and `isort` before committing
6. **Service testing**: Start dependencies (PostgreSQL, RabbitMQ) before running services
7. **AI metadata**: Follow the established JSONB structure for consistency
8. **MCP tools**: Add new tools to `src/smart_analysis/mcp_server.py`
**Adding a monitored channel:**
1. Add channel to `MONITORED_CHANNELS` in `.env`
2. Restart aggregator service
3. Channel automatically added to database on first message
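The exact value format for `MONITORED_CHANNELS` is an assumption here (check `config.env.template` for the real one); a comma-separated list of channel usernames is a common convention:

```bash
# .env -- illustrative values only
MONITORED_CHANNELS=channel_one,channel_two
```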
**Modifying AI analysis prompts:**
1. Edit `src/ai_analysis/gemini_processor.py`
2. Update AI metadata structure if needed
3. Run unit tests: `pytest tests/unit/test_ai_analysis.py`
4. Test with integration: `pytest -m integration`
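When changing the prompt, the contract that matters is that the model's reply still parses into the documented metadata structure. A hedged, Gemini-free sketch of that parsing step (the prompt text and function are illustrative, not the actual `gemini_processor.py` code):

```python
import json

# Illustrative prompt template; the real one lives in gemini_processor.py.
ANALYSIS_PROMPT = (
    "Analyze the following message and respond with JSON containing "
    "summary, topics, sentiment, entities, keywords and confidence_score.\n\n"
    "Message: {message_text}"
)


def parse_llm_response(raw: str) -> dict:
    """Parse the LLM's JSON reply, stripping Markdown code fences if present."""
    text = raw.strip()
    if text.startswith("```"):
        # Drop the opening fence line (possibly "```json") and the closing fence.
        lines = text.splitlines()
        text = "\n".join(lines[1:-1])
    return json.loads(text)
```

Keeping the parsing step isolated like this lets unit tests exercise it with canned replies, without calling the Gemini API.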
**Adding an MCP tool:**
1. Add tool function to `src/smart_analysis/mcp_server.py`
2. Use FastMCP decorators for tool registration
3. Test via MCP client or Claude Desktop integration
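The tool body itself is usually plain Python that can be unit-tested without an MCP client. A hedged sketch (the tool name and logic below are illustrative, not an actual Tel-Insights tool):

```python
from collections import Counter

# In src/smart_analysis/mcp_server.py this function would be registered
# with a FastMCP decorator, along the lines of:
#
#   mcp = FastMCP("smart-analysis")
#
#   @mcp.tool()
#   def topic_frequency(...): ...
#
# The logic here is a sketch for illustration only.


def topic_frequency(metadata_rows: list[dict], top_n: int = 5) -> list[tuple[str, int]]:
    """Count topic occurrences across stored AI-metadata rows."""
    counter = Counter()
    for row in metadata_rows:
        counter.update(row.get("topics", []))
    return counter.most_common(top_n)
```

Separating the logic from the decorator keeps it testable with plain pytest before verifying end-to-end via an MCP client.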
**Changing the database schema:**
1. Modify models in `src/shared/models.py`
2. Generate migration: `alembic revision --autogenerate -m "description"`
3. Review generated migration in `alembic/versions/`
4. Apply: `alembic upgrade head`