A specialized skill for building intelligent chat experiences with persistent, stateful AI agents using the `@cloudflare/ai-chat` package.

This skill helps you implement AI-powered chat applications with agents that remember context, stream responses in real time, and seamlessly integrate tools. It provides:

- Streaming responses with real-time updates
- Automatic message persistence (SQLite-backed Durable Objects)
- Resumable streams that survive refreshes and reconnects
- Server-side and client-side tool integration
When the user wants to create an AI chat application:
1. Install required dependencies:
```bash
npm install @cloudflare/ai-chat agents ai
```
2. Install AI provider SDK (choose based on user preference):
```bash
npm install @ai-sdk/openai # or @ai-sdk/anthropic, @ai-sdk/google, etc.
```
3. Create a chat agent class extending `AIChatAgent`:
- Import `AIChatAgent` from `@cloudflare/ai-chat`
- Import AI SDK utilities: `streamText`, `convertToModelMessages`, `createUIMessageStream`, `createUIMessageStreamResponse`
- Override `onChatMessage()` method to handle chat logic
- Return streaming or non-streaming responses
4. Configure Durable Objects in `wrangler.toml`:
- Add `durable_objects.bindings` entry with agent class name
- Add migration with `new_sqlite_classes` for persistence
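Steps 3 and 4 can be sketched as a minimal `wrangler.toml` fragment. The class name `ChatAgent` and binding name `CHAT_AGENT` are illustrative placeholders; use your own agent class name:

```toml
[[durable_objects.bindings]]
name = "CHAT_AGENT"
class_name = "ChatAgent"

[[migrations]]
tag = "v1"
new_sqlite_classes = ["ChatAgent"]
```

The `new_sqlite_classes` migration is what enables SQLite-backed persistence for the agent's messages.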
For standard streaming chat responses:
```ts
import { AIChatAgent } from "@cloudflare/ai-chat";
import { openai } from "@ai-sdk/openai";
import {
  streamText,
  convertToModelMessages,
  createUIMessageStream,
  createUIMessageStreamResponse
} from "ai";

export class ChatAgent extends AIChatAgent<Env> {
  async onChatMessage() {
    const stream = createUIMessageStream({
      execute: async ({ writer }) => {
        const result = streamText({
          model: openai("gpt-4o"),
          messages: await convertToModelMessages(this.messages)
        });
        writer.merge(result.toUIMessageStream());
      }
    });

    return createUIMessageStreamResponse({ stream });
  }
}
```
For simpler use cases without real-time updates:
```ts
import { AIChatAgent } from "@cloudflare/ai-chat";
import { openai } from "@ai-sdk/openai";
import { generateText, convertToModelMessages } from "ai";

export class SimpleChat extends AIChatAgent<Env> {
  async onChatMessage() {
    const result = await generateText({
      model: openai("gpt-4o"),
      messages: await convertToModelMessages(this.messages)
    });

    const message = result.toUIMessage({
      metadata: {
        model: "gpt-4o",
        totalTokens: result.usage?.totalTokens
      }
    });

    return new Response(JSON.stringify(message), {
      headers: { "Content-Type": "application/json" }
    });
  }
}
```
Implement the chat interface using React hooks:
```tsx
import { useState } from "react";
import { useAgent } from "agents/react";
import { useAgentChat } from "@cloudflare/ai-chat/react";

function ChatInterface() {
  const agent = useAgent({
    agent: "ChatAgent",
    name: "my-chat"
  });

  const { messages, sendMessage, clearHistory, status } = useAgentChat({
    agent
  });

  const [input, setInput] = useState("");

  const handleSubmit = async (e: React.FormEvent) => {
    e.preventDefault();
    if (!input.trim()) return;

    await sendMessage({
      role: "user",
      parts: [{ type: "text", text: input }]
    });
    setInput("");
  };

  return (
    <div className="chat-interface">
      <div className="messages">
        {messages.map((message) => (
          <div key={message.id} className="message">
            <div className="role">{message.role}</div>
            <div className="content">
              {message.parts.map((part, i) => {
                if (part.type === "text") return <span key={i}>{part.text}</span>;
                return null;
              })}
            </div>
          </div>
        ))}
      </div>
      <form onSubmit={handleSubmit}>
        <input
          value={input}
          onChange={(e) => setInput(e.target.value)}
          placeholder="Type your message..."
        />
      </form>
      <button onClick={clearHistory}>Clear Chat</button>
    </div>
  );
}
```
Add tools that execute on the server:
```ts
import { tool } from "ai";
import { z } from "zod";
// Plus the imports from the streaming example above

export class ToolChat extends AIChatAgent<Env> {
  async onChatMessage() {
    const stream = createUIMessageStream({
      execute: async ({ writer }) => {
        const result = streamText({
          model: openai("gpt-4o"),
          messages: await convertToModelMessages(this.messages),
          tools: {
            getWeather: tool({
              description: "Get weather for a city",
              inputSchema: z.object({ city: z.string() }),
              execute: async ({ city }) => {
                // Placeholder: call a real weather API for `city` here
                return { temperature: 72, condition: "sunny" };
              }
            })
          }
        });
        writer.merge(result.toUIMessageStream());
      }
    });

    return createUIMessageStreamResponse({ stream });
  }
}
```
For tools that need browser access (DOM manipulation, user interactions):
**Server Agent:**
```ts
// Define the tool without an execute function
showAlert: tool({
  description: "Shows an alert to the user",
  inputSchema: z.object({ message: z.string() })
  // No execute = client handles it
})
```
**Client Component:**
```tsx
const { messages, sendMessage, addToolResult } = useAgentChat({
  agent,
  onToolCall: async ({ toolCall, addToolOutput }) => {
    if (toolCall.toolName === "showAlert") {
      alert(toolCall.input.message);
      addToolOutput({
        toolCallId: toolCall.toolCallId,
        output: { success: true }
      });
    }
  }
});
```
For tools requiring user approval:
```tsx
const { messages, addToolResult } = useAgentChat({
  agent,
  toolsRequiringConfirmation: ["sendEmail", "deleteFile"],
  onToolCall: async ({ toolCall, addToolOutput }) => {
    if (toolCall.toolName === "sendEmail") {
      const approved = confirm(`Send email to ${toolCall.input.recipient}?`);
      if (approved) {
        const result = await sendEmail(toolCall.input);
        addToolOutput({
          toolCallId: toolCall.toolCallId,
          output: result,
          autoContinue: true
        });
      } else {
        // Report the decline so the model isn't left waiting for a result
        addToolOutput({
          toolCallId: toolCall.toolCallId,
          output: { approved: false }
        });
      }
    }
  }
});
```
Track analytics, timing, and custom data:
**Server:**
```ts
const stream = createUIMessageStream({
  execute: async ({ writer }) => {
    const result = streamText({
      model: openai("gpt-4o"),
      messages: await convertToModelMessages(this.messages)
    });

    writer.merge(
      result.toUIMessageStream({
        messageMetadata: ({ part }) => {
          if (part.type === "start") {
            return {
              model: "gpt-4o",
              createdAt: Date.now()
            };
          }
          if (part.type === "finish") {
            return {
              totalTokens: part.totalUsage?.totalTokens
            };
          }
        }
      })
    );
  }
});
```
**Client:**
```tsx
{messages.map((message) => (
  <div key={message.id}>
    {message.metadata?.totalTokens && (
      <span>{message.metadata.totalTokens} tokens</span>
    )}
  </div>
))}
```
**Custom Request Preparation:**
Attach extra context (custom body fields and headers) to every outgoing chat request:
```tsx
const { sendMessage } = useAgentChat({
  agent,
  prepareSendMessagesRequest: ({ id, messages }) => ({
    body: {
      currentUrl: window.location.href,
      userTimezone: Intl.DateTimeFormat().resolvedOptions().timeZone
    },
    headers: {
      "X-Widget-Version": "1.0.0"
    }
  })
});
```
**Automatic Tool Continuation:**
Continue the conversation automatically after a tool result is added:
```tsx
const { addToolResult } = useAgentChat({
  agent,
  autoContinueAfterToolResult: true,
  onToolCall: async ({ toolCall, addToolOutput }) => {
    const result = await executeTool(toolCall);
    addToolOutput({
      toolCallId: toolCall.toolCallId,
      output: result,
      autoContinue: true
    });
  }
});
```
Streams automatically resume on reconnect (enabled by default):
```tsx
const { messages, status } = useAgentChat({
  agent,
  resume: true // default behavior
});

// Streams automatically resume if the user refreshes or reconnects
```
**Important Notes:**
1. **Durable Objects Required**: This package requires Cloudflare Durable Objects for persistence
2. **AI SDK v6**: Built for AI SDK version 6 - ensure compatibility
3. **Message Format**: Uses AI SDK message format with parts (`{ role, parts: [{ type, text }] }`)
4. **Streaming Persistence**: All streamed chunks are automatically persisted to SQLite
5. **Tool Execution**: Client tools run in browser; server tools run in Cloudflare Workers
6. **Resumption**: Stream resumption works automatically - no manual implementation needed
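To make the parts-based message format concrete, here is a minimal sketch of the shape described above. The type names are illustrative placeholders, not types exported by the package:

```typescript
// Illustrative shape of a parts-based chat message (type names are ours, not the package's)
type TextPart = { type: "text"; text: string };
type ChatMessage = {
  id: string;
  role: "user" | "assistant" | "system";
  parts: TextPart[];
};

const message: ChatMessage = {
  id: "msg-1",
  role: "user",
  parts: [
    { type: "text", text: "What's the weather " },
    { type: "text", text: "in Lisbon?" }
  ]
};

// Rendering concatenates the text parts in order
const text = message.parts
  .filter((p) => p.type === "text")
  .map((p) => p.text)
  .join("");

console.log(text); // "What's the weather in Lisbon?"
```

Messages can also carry non-text parts (tool calls, tool outputs), which is why the client render loop above checks `part.type` before displaying each part.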