You are working with the **apiagents** repository - a FastAPI service that exposes LLM-friendly tools and Arachne autonomous research APIs, backed by Celery background workers and a Python sandbox executor.
**Main entrypoint:** `src/main.py` - loads environment, initializes Postgres + legacy SQLite, registers routers, instruments with Logfire.
Located in `src/routers/jobs.py`:
**Phase 1 - Plan Negotiation:**
**Phase 2 - Execution:**
Located in `src/routers/chat.py`:
From the repository root, start the full stack with Docker:
```bash
docker compose up -d --build apiagents celery-worker
```
Or run locally without Docker:
```bash
./run.sh
uvicorn src.main:app --reload --port 8359
```
- Postgres models: `src/db/models.py`
- Session helpers: `src/db/session.py`
- FastAPI dependency: `get_db`
All background tasks must be:
1. Importable by Celery worker
2. Registered in `src/tasks/celery_app.py`
3. Routed to the `arachne` queue
For trace continuity across Celery tasks:
**When enqueueing:**
```python
from src.core.trace_propagation import make_celery_headers
task.apply_async(
    args=[...],
    headers=make_celery_headers({...}),
)
```
**In task implementation:**
```python
from src.core.trace_propagation import attached_otel_context_from_headers
@celery.task
def my_task(**kwargs):
    headers = kwargs.get("headers", {})
    with attached_otel_context_from_headers(headers):
        # Task logic with trace context
        pass
```
Use `src/services/python_executor.py` (`PythonExecutor` class) for secure code execution.
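The `PythonExecutor` API itself is not shown in this document, so the following is a hypothetical stand-in that illustrates the underlying idea: run untrusted code in a separate, isolated interpreter process with a hard timeout, never via `exec()` inside the API process.

```python
import subprocess
import sys

# Hypothetical sketch, not the real PythonExecutor interface
def run_sandboxed(code: str, timeout: float = 5.0) -> str:
    result = subprocess.run(
        [sys.executable, "-I", "-c", code],  # -I: isolated mode, no user site-packages
        capture_output=True,
        text=True,
        timeout=timeout,  # hard wall-clock limit on untrusted code
    )
    return result.stdout
```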
**Requirements:**
Always use structured logging:
```python
from src.core.logging import get_logger
logger = get_logger(__name__)
logger.info("Message", extra={"key": "value"})
```
**Never use `print()` statements in production code.**
```bash
./run_tests.sh # Run all tests
./run_tests.sh --unit # Unit tests only
./run_tests.sh --integration # Integration tests only
./run_tests.sh --new # Recently modified tests
./run_tests.sh --quick # Fast subset
./run_tests.sh --coverage # With coverage report
```
```bash
ruff check . # Lint
ruff format . # Format
```
Configuration in `pyproject.toml` (Python 3.13 target).
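For reference, a minimal `[tool.ruff]` table matching that description; the real settings live in the repository's `pyproject.toml`:

```toml
[tool.ruff]
# Target interpreter for lint rules and syntax support
target-version = "py313"
```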
1. **Routers:** Keep route handlers thin - delegate to service layer
2. **Schemas:** Use Pydantic v2 for all request/response models
3. **Database:** Always use the `get_db` dependency for sessions
4. **Async:** Prefer async/await for I/O-bound operations
5. **Background Tasks:** Use Celery for long-running operations
6. **Tracing:** Always propagate trace context to background tasks
7. **Logging:** Use structured logging with context
8. **Security:** Never execute untrusted code outside the sandbox
9. **Testing:** Write tests for new features and bug fixes
10. **Code Style:** Follow Ruff's recommendations
1. Create router in `src/routers/my_router.py`
2. Define Pydantic schemas in `src/schemas/my_schemas.py`
3. Register router in `src/main.py`
4. Add tests in `tests/routers/test_my_router.py`
1. Define task in `src/tasks/my_tasks.py`
2. Import in `src/tasks/celery_app.py`
3. Enqueue with trace headers
4. Handle context in task implementation
1. Define model in `src/db/models.py`
2. Use `get_db` dependency in route
3. Handle sessions properly (commit/rollback)
4. Write migrations if needed
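Step 3 (commit on success, roll back on failure) can be sketched as follows; the in-memory SQLite table is a stand-in for a real Postgres model, and `last_insert_rowid()` is SQLite-specific:

```python
from sqlalchemy import create_engine, text
from sqlalchemy.exc import SQLAlchemyError
from sqlalchemy.orm import sessionmaker

# Stand-in schema; the real service defines ORM models in src/db/models.py
engine = create_engine("sqlite://")
SessionLocal = sessionmaker(bind=engine)
with engine.begin() as conn:
    conn.execute(text("CREATE TABLE notes (id INTEGER PRIMARY KEY, body TEXT)"))

def create_note(body: str) -> int:
    db = SessionLocal()
    try:
        db.execute(text("INSERT INTO notes (body) VALUES (:b)"), {"b": body})
        new_id = db.execute(text("SELECT last_insert_rowid()")).scalar_one()
        db.commit()       # persist on success
        return new_id
    except SQLAlchemyError:
        db.rollback()     # never leave the session dirty
        raise
    finally:
        db.close()
```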