Integrate LangChain agents and graphs with AI SDK UI components using the `@ai-sdk/langchain` adapter. This skill covers message conversion, stream transformation, and LangSmith deployment integration.
Install the required packages:
```bash
npm install @ai-sdk/langchain @langchain/core
```
Note: `@langchain/core` is a required peer dependency.
Convert AI SDK `UIMessage` objects to LangChain `BaseMessage` format:
```typescript
import { toBaseMessages } from '@ai-sdk/langchain';
// Convert UI messages to LangChain format
const langchainMessages = await toBaseMessages(uiMessages);
// Use with any LangChain model
const response = await model.invoke(langchainMessages);
```
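To make the conversion concrete, here is a hand-rolled sketch of what it does for text-only messages. The shapes below are simplified assumptions for illustration; the real `UIMessage` and `BaseMessage` types are richer, and the actual adapter also handles tool calls, files, and metadata.

```typescript
// Simplified shapes -- assumptions, not the real SDK types.
type SimpleUIMessage = {
  id: string;
  role: 'system' | 'user' | 'assistant';
  parts: Array<{ type: string; text?: string }>;
};
type SimpleBaseMessage = { role: string; content: string };

// Sketch of the conversion for text-only input: concatenate the text
// parts of each UI message into a single content string.
function sketchToBaseMessages(uiMessages: SimpleUIMessage[]): SimpleBaseMessage[] {
  return uiMessages.map((m) => ({
    role: m.role,
    content: m.parts
      .filter((p) => p.type === 'text')
      .map((p) => p.text ?? '')
      .join(''),
  }));
}
```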
Transform LangGraph streams to AI SDK format:
```typescript
import { toBaseMessages, toUIMessageStream } from '@ai-sdk/langchain';
import { createUIMessageStreamResponse } from 'ai';
const langchainMessages = await toBaseMessages(uiMessages);
const langchainStream = await graph.stream(
  { messages: langchainMessages },
  { streamMode: ['values', 'messages'] }
);

return createUIMessageStreamResponse({
  stream: toUIMessageStream(langchainStream),
});
```
Use `streamEvents()` for granular event handling:
```typescript
import { toBaseMessages, toUIMessageStream } from '@ai-sdk/langchain';
import { createUIMessageStreamResponse } from 'ai';

const langchainMessages = await toBaseMessages(uiMessages);

const streamEvents = agent.streamEvents(
  { messages: langchainMessages },
  { version: 'v2' }
);

return createUIMessageStreamResponse({
  stream: toUIMessageStream(streamEvents),
});
```
The adapter converts both kinds of streams to UI message chunks automatically.
Emit custom data events from LangChain tools:
```typescript
import { tool, type ToolRuntime } from 'langchain';
import { z } from 'zod';
const analyzeDataTool = tool(
  async ({ query }, config: ToolRuntime) => {
    // Emit progress updates - becomes 'data-progress' in the UI
    config.writer?.({
      type: 'progress',
      id: 'analysis-1', // Include 'id' to persist in message.parts
      step: 'fetching',
      message: 'Fetching data...',
      progress: 50,
    });

    // ... perform analysis ...

    // Emit status update - becomes 'data-status' in the UI
    config.writer?.({
      type: 'status',
      id: 'analysis-1-status',
      status: 'complete',
      message: 'Analysis finished',
    });

    return 'Analysis complete';
  },
  {
    name: 'analyze_data',
    description: 'Analyze data with progress updates',
    schema: z.object({ query: z.string() }),
  }
);
```
Enable custom stream mode:
```typescript
const stream = await graph.stream(
  { messages: langchainMessages },
  { streamMode: ['values', 'messages', 'custom'] }
);
```
**Custom data behavior:** each event emitted via `config.writer` surfaces in the UI as a `data-<type>` part (e.g. `type: 'progress'` becomes `data-progress`). Including an `id` makes the adapter update the existing part in `message.parts` instead of appending a new one.
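Based on the comments in the tool above, the event-to-part mapping can be sketched as follows. The part shape here is an assumption for illustration, not the adapter's actual type:

```typescript
// Sketch: how a custom event emitted via config.writer maps onto a UI data part.
// The 'data-<type>' naming and the role of 'id' follow the tool example above;
// the exact part shape is an assumption.
type CustomEvent = { type: string; id?: string; [key: string]: unknown };
type DataPart = { type: string; id?: string; data: Record<string, unknown> };

function sketchToDataPart(event: CustomEvent): DataPart {
  const { type, id, ...rest } = event;
  return {
    type: `data-${type}`, // 'progress' becomes 'data-progress'
    ...(id !== undefined ? { id } : {}), // same id => part is updated in place
    data: rest,
  };
}
```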
Connect to LangGraph deployments from the browser:
```tsx
import { useChat } from '@ai-sdk/react';
import { LangSmithDeploymentTransport } from '@ai-sdk/langchain';
import { useMemo, useState } from 'react';

function Chat() {
  const transport = useMemo(
    () =>
      new LangSmithDeploymentTransport({
        url: 'https://your-deployment.us.langgraph.app',
        // This value is bundled into the browser; use a non-secret key
        // or proxy requests through your server.
        apiKey: process.env.LANGSMITH_API_KEY,
      }),
    []
  );

  const { messages, sendMessage } = useChat({ transport });
  const [input, setInput] = useState('');

  return (
    <div>
      {messages.map((m) => (
        <div key={m.id}>
          {m.parts.map((part, i) =>
            part.type === 'text' ? <span key={i}>{part.text}</span> : null
          )}
        </div>
      ))}
      <form
        onSubmit={(e) => {
          e.preventDefault();
          sendMessage({ text: input });
          setInput('');
        }}
      >
        <input value={input} onChange={(e) => setInput(e.target.value)} />
        <button type="submit">Send</button>
      </form>
    </div>
  );
}
```
- `toBaseMessages` – converts AI SDK `UIMessage` objects to LangChain `BaseMessage` objects.
- A companion converter handles AI SDK `ModelMessage` objects, also producing LangChain `BaseMessage` objects.
- `toUIMessageStream` – converts LangChain/LangGraph streams to an AI SDK `UIMessageStream`.
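The core of the stream conversion can be sketched with async generators. The chunk shapes below are simplified assumptions; the real function also emits tool, data, and metadata chunks:

```typescript
// Simplified chunk shapes -- assumptions for illustration only.
type LangChainChunk = { content: string };
type UIChunk =
  | { type: 'text-start'; id: string }
  | { type: 'text-delta'; id: string; delta: string }
  | { type: 'text-end'; id: string };

// Sketch of the stream conversion: forward model text chunks as UI text
// deltas, wrapped in start/end markers for a single text part.
async function* sketchToUIMessageStream(
  source: AsyncIterable<LangChainChunk>,
): AsyncGenerator<UIChunk> {
  const id = 'text-1';
  yield { type: 'text-start', id };
  for await (const chunk of source) {
    if (chunk.content.length > 0) {
      yield { type: 'text-delta', id, delta: chunk.content };
    }
  }
  yield { type: 'text-end', id };
}
```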
`toUIMessageStream` accepts both LangGraph `graph.stream()` output (stream modes `values`, `messages`, and `custom`) and `streamEvents()` event streams (`version: 'v2'`).

`LangSmithDeploymentTransport` is a `ChatTransport` implementation for LangSmith/LangGraph deployments, constructed with the deployment `url` and an `apiKey`.
When implementing LangChain integration:
1. Install dependencies: `@ai-sdk/langchain` and `@langchain/core`
2. Import conversion utilities: `toBaseMessages`, `toUIMessageStream`
3. Convert UI messages to LangChain format before invoking agents/graphs
4. Transform LangChain streams back to UI format using `toUIMessageStream`
5. For custom data, add `config.writer?.()` calls in tools with appropriate `type` and optional `id`
6. Enable necessary stream modes (`messages`, `values`, `custom`)
7. For browser deployments, use `LangSmithDeploymentTransport` with `useChat` hook
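The steps above wire together as in the stubbed sketch below. The stubs stand in for `@ai-sdk/langchain` and a LangGraph agent so the flow is runnable on its own; in a real route handler you would import `toBaseMessages` and `toUIMessageStream` instead.

```typescript
type Msg = { role: string; content: string };

// Step 3 (stub): convert UI messages to LangChain format.
async function convertToBaseMessages(ui: Msg[]): Promise<Msg[]> {
  return ui.map((m) => ({ ...m }));
}

// Stand-in agent that streams one chunk per word of the last message.
async function* runAgent(messages: Msg[]): AsyncGenerator<{ content: string }> {
  const last = messages[messages.length - 1];
  for (const word of last.content.split(' ')) {
    yield { content: word + ' ' };
  }
}

// Step 4 (stub): transform the agent stream back to UI chunks.
async function* toUIStream(src: AsyncIterable<{ content: string }>) {
  for await (const chunk of src) {
    yield { type: 'text-delta' as const, delta: chunk.content };
  }
}

// Steps 3-4 wired together, as a route handler would do before
// wrapping the stream in createUIMessageStreamResponse.
async function handle(uiMessages: Msg[]): Promise<string> {
  const lcMessages = await convertToBaseMessages(uiMessages);
  let text = '';
  for await (const part of toUIStream(runAgent(lcMessages))) {
    text += part.delta;
  }
  return text.trimEnd();
}
```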
**Basic agent streaming:**
```typescript
const langchainMessages = await toBaseMessages(messages);
const stream = agent.streamEvents({ messages: langchainMessages }, { version: 'v2' });
return createUIMessageStreamResponse({ stream: toUIMessageStream(stream) });
```
**Graph with custom data:**
```typescript
const langchainMessages = await toBaseMessages(messages);
const stream = await graph.stream(
  { messages: langchainMessages },
  { streamMode: ['values', 'messages', 'custom'] }
);
return createUIMessageStreamResponse({ stream: toUIMessageStream(stream) });
```
**Browser deployment:**
```tsx
const transport = new LangSmithDeploymentTransport({
  url: deploymentUrl,
  apiKey: apiKey,
});
const { messages, ... } = useChat({ transport });
```
Refer to the [AI SDK documentation](https://ai-sdk.dev) for complete guides and examples.