Expert guidance for working on an Azure OpenAI-powered chat application with a Python (Quart) backend, a TypeScript/React (Vite) frontend, and Azure service integrations.
**Backend:** Python with Quart (async web framework); entry point `app.py`
**Frontend:** TypeScript/React built with Vite, in `frontend/`
**Key endpoints:** `/conversation` (chat) and `/history/*` (conversation history, backed by CosmosDB)
1. **Copy environment template:**
```bash
cp .env.sample .env
```
2. **Configure required variables in `.env`:**
- `AZURE_OPENAI_MODEL`
- `AZURE_OPENAI_ENDPOINT` or `AZURE_OPENAI_RESOURCE`
- `AZURE_OPENAI_KEY` (or Entra ID environment variables)
- Optional: `AZURE_COSMOSDB_*`, `AZURE_SEARCH_*`, etc.
3. **Run the application:**
- **Windows:** `start.cmd`
- **Linux/Mac:** `start.sh`
These scripts build the frontend, install backend dependencies, and start the Quart app.
4. **Access the app:** http://127.0.0.1:50505
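Putting step 2 together, a minimal key-based `.env` might look like the following (the resource name, deployment name, and key are placeholders):

```shell
# Required: deployment name and resource endpoint
AZURE_OPENAI_MODEL=my-gpt-4o-deployment
AZURE_OPENAI_ENDPOINT=https://my-resource.openai.azure.com/
# Key auth; omit this and set the Entra ID variables instead if preferred
AZURE_OPENAI_KEY=<your-api-key>
# Optional: stream responses as NDJSON
AZURE_OPENAI_STREAM=true
```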
**Backend-only development:**
```bash
python -m app
```
(Requires frontend static files already built)
**Frontend development:**
```bash
cd frontend
npm install
npm run dev
```
Built artifacts are copied to `static/` for production.
**Run tests:**
```bash
pip install -r requirements-dev.txt
pytest
```
**Single source of truth:** `.env` file
**Settings module:** `backend/settings.py`
**Key settings:** `AZURE_OPENAI_*` (core), `AZURE_COSMOSDB_*` (chat history), `AZURE_SEARCH_*` (datasource)
**Request flow:**
1. Frontend calls `/conversation` or `/history/*`
2. Backend constructs model payload via `prepare_model_args()` in `app.py`
3. If datasource configured, `extra_body.data_sources` is added to payload
4. When `AZURE_OPENAI_STREAM=true`, responses stream as NDJSON
5. When function calling enabled (`AZURE_OPENAI_FUNCTION_CALL_AZURE_FUNCTIONS_ENABLED=true`), backend fetches and executes remote function definitions
**Pattern:** implement a `DatasourcePayloadConstructor` subclass
**Method to override:**
```python
def construct_payload_configuration(self, *, request=None) -> dict:
    ...
```
**Examples:** See `_AzureSearchSettings`, `_ElasticsearchSettings` in `backend/settings.py`
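A minimal sketch of the pattern. The base-class interface here is a stand-in (the real one lives in `backend/settings.py`), and `my_datasource` is a hypothetical `datasource_type` value:

```python
class DatasourcePayloadConstructor:
    """Stand-in for the real base class in backend/settings.py (interface assumed)."""

    def construct_payload_configuration(self, *, request=None) -> dict:
        raise NotImplementedError


class MyDatasourceSettings(DatasourcePayloadConstructor):
    """Hypothetical datasource; the payload mirrors the Azure 'On Your Data' shape."""

    def __init__(self, endpoint: str, index_name: str):
        self.endpoint = endpoint
        self.index_name = index_name

    def construct_payload_configuration(self, *, request=None) -> dict:
        return {
            "type": "my_datasource",  # hypothetical datasource_type value
            "parameters": {"endpoint": self.endpoint, "index_name": self.index_name},
        }
```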
**Supported datasources:** Azure AI Search, Elasticsearch, and the other `datasource_type` values defined in `backend/settings.py`
Both streaming and non-streaming modes are supported; the implementation lives in `app.py` (`send_chat_request`, `stream_chat_request`).
**Format helpers:** `backend/utils.py`
Keep formatting consistent across both modes when making changes.
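The NDJSON framing itself is a simple invariant to preserve: one complete JSON document per line, newline-terminated. A sketch (not the repo's exact helper):

```python
import json


def format_as_ndjson(event: dict) -> str:
    """Serialize one event as a single NDJSON line."""
    return json.dumps(event, ensure_ascii=False) + "\n"


# A streamed response is a sequence of such lines; clients split on "\n"
# and parse each line independently.
line = format_as_ndjson(
    {"choices": [{"messages": [{"role": "assistant", "content": "Hi"}]}]}
)
```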
**When enabled:** the backend fetches remote function definitions and executes tool calls against Azure Functions
**Configuration:** `AZURE_OPENAI_FUNCTION_CALL_AZURE_FUNCTIONS_ENABLED`
**Secrets masking:** `prepare_model_args()` in `app.py` creates `model_args_clean`, a copy of the payload that hides secret values such as API keys and datasource credentials.
**Logging:** Only masked payloads are logged. Never log raw secrets.
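The idea behind `model_args_clean` can be sketched like this (the set of masked field names is an assumption, not the repo's actual list):

```python
import copy

# Assumed set of sensitive field names; the real list lives in app.py
SENSITIVE_KEYS = {"key", "api_key", "authentication"}


def mask_secrets(model_args: dict) -> dict:
    """Deep-copy the payload and redact sensitive string values for logging."""
    clean = copy.deepcopy(model_args)

    def _mask(node):
        if isinstance(node, dict):
            for k, v in node.items():
                if k in SENSITIVE_KEYS and isinstance(v, str):
                    node[k] = "*****"
                else:
                    _mask(v)
        elif isinstance(node, list):
            for item in node:
                _mask(item)

    _mask(clean)
    return clean
```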
| Purpose | Location |
|---------|----------|
| Request/response construction | `app.py` (`prepare_model_args`, `send_chat_request`, `stream_chat_request`) |
| Settings and env parsing | `backend/settings.py` |
| Conversation storage | `backend/history/cosmosdbservice.py` |
| Utilities and formatters | `backend/utils.py` (NDJSON, streaming, secret masking) |
| Frontend UI and API calls | `frontend/src/` (e.g., `Answer/Answer.tsx` for citations/feedback) |
**Core dependency:** Azure OpenAI (`AZURE_OPENAI_*` env vars)
**Optional services:** CosmosDB for chat history (`AZURE_COSMOSDB_*`), Azure AI Search (`AZURE_SEARCH_*`), Elasticsearch, and other datasources
1. Add to appropriate Settings class in `backend/settings.py`
2. Use `model_validator` or `field_validator` for transformations
3. Document in `.env.sample`
1. Create `DatasourcePayloadConstructor` subclass in `backend/settings.py`
2. Implement `construct_payload_configuration(request=...)`
3. Add to `datasource_type` enum
4. Test payload construction in isolation
1. Update both streaming and non-streaming helpers in `backend/utils.py`
2. Verify frontend parsing in `frontend/src/`
3. Test with network tab in browser devtools
⚠️ **Handle carefully** - Pydantic validators accept both strings and JSON-encoded env vars (see `deserialize_tools`, `split_contexts`)
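The dual-format pattern looks roughly like this (the real `deserialize_tools`/`split_contexts` may differ; this only illustrates the idea that a value can arrive as a JSON-encoded string or as an already-parsed object):

```python
import json


def deserialize_maybe_json(value):
    """Accept either a JSON-encoded string or an already-parsed value."""
    if isinstance(value, str):
        try:
            return json.loads(value)
        except json.JSONDecodeError:
            # Not JSON: treat the raw string as a single-item list
            return [value]
    return value
```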
After making changes:
1. ✅ Update `.env` with required keys from `.env.sample`
2. ✅ Run `start.cmd` (Windows) or `start.sh` (Linux/Mac)
3. ✅ Verify app serves at http://127.0.0.1:50505
4. ✅ Test `/conversation` endpoint via browser devtools network tab
5. ✅ If history logic changed, test `/history/ensure` for CosmosDB connectivity
6. ✅ Run `pytest` for regression checks
**Frontend not loading:** the Quart app serves prebuilt assets from `static/`; run `start.cmd`/`start.sh` (or `npm run build` in `frontend/`) so they exist.
**Azure OpenAI connection errors:** check `AZURE_OPENAI_ENDPOINT` (or `AZURE_OPENAI_RESOURCE`) and `AZURE_OPENAI_KEY` or your Entra ID credentials in `.env`.
**Streaming not working:** confirm `AZURE_OPENAI_STREAM=true` and inspect the NDJSON response in the browser devtools network tab.
Keep `.github/copilot-instructions.md` updated when endpoints, environment variables, or the datasource pattern change.