Build a Flask-based chat application with Redis persistence, Datadog monitoring, streaming server-sent events (SSE), and a modal-based UI, keeping frontend and backend concerns separate.

This skill helps you build a chat application called **LLM-2000**.
Create the main Flask application with Datadog and Redis:
```python
from flask import Flask
from ddtrace import patch_all, tracer
import redis

patch_all()  # auto-instrument Flask and Redis for Datadog APM

app = Flask(__name__)
redis_client = redis.Redis(host="redis", port=6379, decode_responses=True)
```
**Key considerations:**
Build the core service for chat operations:
```python
class ChatService:
    def __init__(self, redis_client):
        self.redis = redis_client

    def get_chat_history(self, user_id):
        # HGETALL chat:{user_id}
        pass

    def save_message(self, user_id, role, content):
        # HSET chat:{user_id} timestamp message
        pass

    def get_system_prompt(self, user_id):
        # GET prompt:{user_id} or load default_prompt.txt
        pass

    def stream_response(self, user_id, message):
        # Generate streaming response
        pass
```
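A minimal sketch of how `save_message` and `get_chat_history` could map onto the hash commands in the comments above. An in-memory class stands in for `redis.Redis` so the sketch runs without a server, and the JSON encoding of messages is an assumption, not part of the skill:

```python
import json
import time

class FakeRedis:
    """In-memory stand-in for redis.Redis (hash commands only)."""
    def __init__(self):
        self._hashes = {}

    def hset(self, key, field, value):
        self._hashes.setdefault(key, {})[field] = value

    def hgetall(self, key):
        return dict(self._hashes.get(key, {}))

class ChatService:
    def __init__(self, redis_client):
        self.redis = redis_client

    def save_message(self, user_id, role, content):
        # HSET chat:{user_id} <timestamp> <json-encoded message>
        field = str(time.time())
        message = json.dumps({"role": role, "content": content})
        self.redis.hset(f"chat:{user_id}", field, message)

    def get_chat_history(self, user_id):
        # HGETALL chat:{user_id}, ordered by timestamp field
        raw = self.redis.hgetall(f"chat:{user_id}")
        return [json.loads(raw[k]) for k in sorted(raw, key=float)]

svc = ChatService(FakeRedis())
svc.save_message("user123", "user", "hello")
history = svc.get_chat_history("user123")
```

Swapping `FakeRedis` for a real `redis.Redis(decode_responses=True)` client keeps the same interface.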
**Redis data structures:**
- `chat:{user_id}` — hash mapping message timestamps to messages (`HSET`/`HGETALL`)
- `prompt:{user_id}` — string holding the user's system prompt (`GET`/`SET`)
Define RESTful endpoints with Datadog tracing:
```python
from ddtrace import tracer

@app.route('/api/chat', methods=['GET', 'POST', 'DELETE'])
@tracer.wrap()
def chat():
    # GET: Check existence, load history
    # POST: Send message, return SSE stream
    # DELETE: Clear history
    pass

@app.route('/api/prompt', methods=['GET', 'POST'])
@tracer.wrap()
def prompt():
    # GET: Load system prompt
    # POST: Save system prompt
    pass
```
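The POST branch of `/api/chat` returns an SSE stream. The wire format is simple enough to sketch as a pure helper; a Flask view would yield these frames from a generator with mimetype `text/event-stream`. The `done` event name is an assumption for signalling end-of-stream, not part of the SSE specification:

```python
def sse_frame(data, event=None):
    """Format one server-sent event: optional event name, data line, blank line."""
    lines = []
    if event:
        lines.append(f"event: {event}")
    lines.append(f"data: {data}")
    return "\n".join(lines) + "\n\n"

# Two token frames followed by a hypothetical end-of-stream marker.
frames = [sse_frame(tok) for tok in ["Hel", "lo"]] + [sse_frame("", event="done")]
```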
**API endpoints:**
- `GET /api/chat?user_id=...` — check existence, load history
- `POST /api/chat` — send message, returns SSE stream
- `DELETE /api/chat?user_id=...` — clear history
- `GET /api/prompt?user_id=...` — load system prompt
- `POST /api/prompt` — save system prompt
Create the API communication layer:
```javascript
// flask/app/static/js/services/ChatService.js
export class ChatService {
  async checkChat(userId) {
    // GET /api/chat?user_id=...
  }

  async sendMessage(userId, message) {
    // POST /api/chat (returns EventSource)
  }

  async clearChat(userId) {
    // DELETE /api/chat?user_id=...
  }

  async loadPrompt(userId) {
    // GET /api/prompt?user_id=...
  }
}
```
Build SSE handling and text buffering:
```javascript
// flask/app/static/js/stream/StreamProcessor.js
export class StreamProcessor {
  constructor(eventSource, onToken, onComplete, onError) {
    this.eventSource = eventSource;
    // Set up message/error/close handlers
  }
}

// flask/app/static/js/stream/TokenBuffer.js
export class TokenBuffer {
  addToken(token) {
    // Buffer tokens for smooth display
  }

  flush() {
    // Display buffered text
  }
}
```
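The buffering idea behind `TokenBuffer` is language-agnostic, so it is sketched here in Python for illustration: accumulate tokens and release them in batches so the UI updates smoothly instead of once per token. The flush threshold of 3 tokens is an arbitrary choice for the example:

```python
class TokenBuffer:
    """Accumulates streamed tokens and releases them in batches."""
    def __init__(self, flush_every=3):
        self.flush_every = flush_every
        self._buf = []
        self.displayed = []  # stands in for DOM updates

    def add_token(self, token):
        self._buf.append(token)
        if len(self._buf) >= self.flush_every:
            self.flush()

    def flush(self):
        # Release whatever is buffered, even a partial batch.
        if self._buf:
            self.displayed.append("".join(self._buf))
            self._buf.clear()

buf = TokenBuffer()
for t in ["a", "b", "c", "d"]:
    buf.add_token(t)
buf.flush()  # final flush drains the leftover token
```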
Build state management and orchestration:
```javascript
// flask/app/static/js/ChatManager.js
export class ChatManager {
  constructor(chatUI, chatService) {
    this.ui = chatUI;
    this.service = chatService;
  }

  async initialize(userId) {
    // Check chat, load history
  }

  async sendMessage(message) {
    // Create stream, process tokens
  }

  async clearChat() {
    // Clear history, reset UI
  }
}
```
Implement DOM interactions:
```javascript
// flask/app/static/js/ChatUI.js
export class ChatUI {
  showUserMessage(message) {
    // Append user message to chat
  }

  startAssistantMessage() {
    // Create assistant message container
  }

  appendToAssistant(text) {
    // Append text to current assistant message
  }

  disableInput() {
    // Lock input while a response is streaming
  }

  enableInput() {
    // Re-enable input when streaming completes
  }
}
```
Create modular CSS with design tokens:
```css
/* flask/app/static/css/variables.css */
:root {
  --purple: #6b46c1;
  --beige: #f5f5dc;
  --spacing-sm: 0.5rem;
  --spacing-md: 1rem;
}

/* flask/app/static/css/components.css */
.button {
  height: 40px; /* Main action buttons */
}

.header-button {
  height: 32px; /* Header/inline buttons */
}

/* flask/app/static/css/modal.css */
.modal {
  /* Modal overlay and positioning */
}
```
**Sizing standards:** 32px header buttons, 40px main action buttons.
Set up comprehensive monitoring:
**Environment variables in docker-compose.yml:**
```yaml
environment:
  - DD_ENV=production
  - DD_VERSION=1.0.0
  - DD_SITE=datadoghq.com
  - DD_API_KEY=${DD_API_KEY}
  - DD_APPLICATION_ID=${DD_APPLICATION_ID}
  - DD_CLIENT_TOKEN=${DD_CLIENT_TOKEN}
```
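The Flask app can collect these settings at startup and pass the RUM credentials into templates. A sketch, using the variable names from the compose file above; the defaults and the function name are assumptions:

```python
import os

def datadog_config(env=os.environ):
    """Gather the Datadog settings the tracer and HTML templates need."""
    return {
        "env": env.get("DD_ENV", "development"),
        "version": env.get("DD_VERSION", "0.0.0"),
        "site": env.get("DD_SITE", "datadoghq.com"),
        "client_token": env.get("DD_CLIENT_TOKEN", ""),
        "application_id": env.get("DD_APPLICATION_ID", ""),
    }

cfg = datadog_config({"DD_ENV": "production", "DD_VERSION": "1.0.0"})
```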
**Frontend RUM:**
```html
<!-- In the HTML template -->
<script>
  window.DD_RUM.init({
    clientToken: '{{ dd_client_token }}',
    applicationId: '{{ dd_application_id }}'
  });
</script>
```
Create the fallback prompt file (`default_prompt.txt`, loaded when a user has no stored prompt):
```
You are a helpful AI assistant...
```
**Inspecting stored data with redis-cli:**
```bash
redis-cli HGETALL chat:user123
redis-cli GET prompt:user123
```
**Conventions:**
1. **Always use application name "LLM-2000"** in code, docs, and UI
2. **Maintain this documentation** when:
- Adding/removing components or services
- Changing file locations or renaming files
- Modifying API endpoints
- Adding CSS modules or major selectors
- Changing data structures or dependencies
3. **Follow size standards**: 32px header buttons, 40px main buttons
4. **Use Remix Icon set** with consistent fill variants
5. **Preserve Redis key patterns**: `chat:{user_id}`, `prompt:{user_id}`
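One way to preserve the Redis key patterns is to build keys in a single place rather than formatting them inline. A sketch; the helper names are hypothetical:

```python
def chat_key(user_id: str) -> str:
    """Redis hash holding a user's chat history."""
    return f"chat:{user_id}"

def prompt_key(user_id: str) -> str:
    """Redis string holding a user's system prompt."""
    return f"prompt:{user_id}"
```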