GenAI-powered agent for Zammad ticketing system. Event-driven microservice using Kafka, FastStream, and pydantic-settings with mTLS security support.
You are an expert AI assistant helping developers work on Zammad-AI, a Python-based service that integrates GenAI capabilities into the Zammad ticketing system through event-driven architecture.
Zammad-AI is an event-driven microservice that:
1. **Ingest**: listens to the `ticket-events` Kafka topic
2. **Filter**: validates events by `request_type` (derived from the field `anliegenart`)
3. **Process**: fetches ticket details via the Zammad API and generates an AI response
4. **Output**: posts the draft response back to Zammad
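The filter step above can be sketched with a Pydantic model. This is a minimal, hypothetical illustration: the field names, the alias mapping from `anliegenart` to `request_type`, and the set of supported values are assumptions, not the project's actual schema (see `app/models/kafka.py` for the real models).

```python
from pydantic import BaseModel, Field


class TicketEvent(BaseModel):
    """Hypothetical event shape; real models live in app/models/kafka.py."""

    ticket_id: int
    # The German source field "anliegenart" is assumed to map onto request_type.
    request_type: str = Field(alias="anliegenart")


# Assumed set of request types the agent handles.
SUPPORTED_REQUEST_TYPES = {"frage", "beschwerde"}


def should_process(event: TicketEvent) -> bool:
    """Return True only for request types the agent supports."""
    return event.request_type.lower() in SUPPORTED_REQUEST_TYPES


event = TicketEvent.model_validate({"ticket_id": 42, "anliegenart": "Frage"})
print(should_process(event))  # True under the assumed set above
```

Unsupported request types would simply be acknowledged and skipped rather than processed.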
**Local development stack:**
```bash
docker compose up -d
uv run python zammad-ai/main.py
```
**Configuration precedence:**
1. CLI arguments & Environment variables (highest)
2. `config.yaml`
3. `.env` files (lowest)
Use `ZAMMAD_AI_` prefix for environment variables.
**Kafka Event Handling:**
```python
from app.kafka.broker import broker
from app.models.kafka import Event
from faststream.exceptions import AckMessage, NackMessage


@broker.subscriber("ticket-events")
async def handle_ticket_event(message: Event):
    ...  # process the event
    raise AckMessage()  # or raise NackMessage() to re-queue on failure
```
**Configuration Access:**
```python
from app.core.settings import get_settings
settings = get_settings()
kafka_broker = settings.kafka.bootstrap_servers
```
**Logging:**
```python
from app.utils.logging import getLogger
logger = getLogger("zammad-ai")
logger.info("Processing ticket event")
```
**Framework**: pytest with pytest-asyncio
**Kafka Testing Pattern:**
```python
import pytest
from faststream.kafka import TestKafkaBroker

from app.core.settings import get_settings
from app.kafka.broker import broker


@pytest.mark.asyncio
async def test_event_handler():
    settings = get_settings()
    async with TestKafkaBroker(broker) as test_broker:
        await test_broker.publish(
            topic=settings.kafka.topic,
            message={"event": "data"},
        )
        # assert on handler side effects here
```
**Test Location**: `zammad-ai/test/`
**Linting & Formatting**: ruff (config in `ruff.toml`)
| File | Purpose |
|------|---------|
| `zammad-ai/main.py` | Application entry point |
| `zammad-ai/app/kafka/broker.py` | Kafka broker config & event handlers |
| `zammad-ai/app/core/settings.py` | pydantic-settings configuration |
| `zammad-ai/app/models/kafka.py` | Pydantic models for Kafka events |
| `zammad-ai/app/utils/logging.py` | Logging utilities |
| `zammad-ai/test/test_kafka.py` | Kafka consumer testing examples |
| `compose.yaml` | Local dev infrastructure |
| `pyproject.toml` | Dependencies & project metadata |
| `ruff.toml` | Linting configuration |
When helping developers:
1. **Understand Context**: Recognize this is an event-driven Python microservice with Kafka at its core.
2. **Follow Patterns**: Use established patterns for:
- Kafka event handlers (subscriber decorators)
- Configuration access (get_settings())
- Logging (getLogger("zammad-ai"))
- Message validation (Pydantic models)
- Ack/Nack policies
3. **Configuration**: Remember the precedence order and `ZAMMAD_AI_` prefix for env vars.
4. **Testing**: Use TestKafkaBroker for in-memory Kafka testing. Write async tests with pytest-asyncio.
5. **Dependencies**: Use `uv` for package management. Update `pyproject.toml` for new dependencies.
6. **Security**: Consider mTLS requirements when modifying Kafka connection code.
7. **Code Quality**: Follow ruff linting rules. Keep code async-first.
8. **Local Development**: Remind developers to run `docker compose up -d` for local Kafka infrastructure.