Install and configure the @langchain/mistralai package for LangChain.js projects. This skill handles installation, dependency resolution, and environment setup, and provides implementation examples for chat models and embeddings.
Follow these steps to install and configure the LangChain MistralAI integration:
Install both @langchain/mistralai and @langchain/core:
```bash
npm install @langchain/mistralai @langchain/core
```
Update package.json to ensure all LangChain packages use the same @langchain/core version. Add the following fields to prevent version conflicts:
```json
{
  "resolutions": {
    "@langchain/core": "^0.3.0"
  },
  "overrides": {
    "@langchain/core": "^0.3.0"
  },
  "pnpm": {
    "overrides": {
      "@langchain/core": "^0.3.0"
    }
  }
}
```
These fields cover Yarn (`resolutions`), npm (`overrides`), and pnpm (`pnpm.overrides`), so the pin applies regardless of which package manager the project uses.
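A quick way to confirm the pins agree is to scan package.json for every declared `@langchain/core` range. A small sketch (the `collectCoreRanges` helper is illustrative, not part of LangChain):

```typescript
// Collect every "@langchain/core" version range declared in a manifest
// (dependencies, resolutions, overrides, and pnpm.overrides) and check
// that they all agree.
function collectCoreRanges(pkg: any): string[] {
  const ranges: string[] = [];
  const sections = [
    pkg.dependencies,
    pkg.devDependencies,
    pkg.resolutions,
    pkg.overrides,
    pkg.pnpm?.overrides,
  ];
  for (const section of sections) {
    const range = section?.["@langchain/core"];
    if (range) ranges.push(range);
  }
  return ranges;
}

// Sample manifest for demonstration; in a real project you would
// JSON.parse(readFileSync("package.json", "utf8")) instead.
const samplePkg = {
  dependencies: { "@langchain/mistralai": "^0.1.0", "@langchain/core": "^0.3.0" },
  resolutions: { "@langchain/core": "^0.3.0" },
  overrides: { "@langchain/core": "^0.3.0" },
  pnpm: { overrides: { "@langchain/core": "^0.3.0" } },
};
const ranges = collectCoreRanges(samplePkg);
console.log(new Set(ranges).size === 1 ? "consistent" : "conflicting"); // → consistent
```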
Configure the Mistral API key in your environment:
**For development (.env.local or .env):**
```
MISTRAL_API_KEY=your-api-key-here
```
**For production environments:** Set the MISTRAL_API_KEY environment variable in your hosting platform.
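To fail fast when the key is absent (rather than hitting an authentication error on the first API call), a small startup guard can help. The `requireEnv` helper below is a hypothetical sketch, not part of LangChain:

```typescript
// Throw at startup if a required environment variable is missing or empty.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// Usage: pass the validated key to the ChatMistralAI constructor.
// const apiKey = requireEnv("MISTRAL_API_KEY");
```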
Create or update files with example code:
**Chat Model Example (standard invocation):**
```typescript
import { ChatMistralAI } from "@langchain/mistralai";
import { HumanMessage } from "@langchain/core/messages";

const model = new ChatMistralAI({
  apiKey: process.env.MISTRAL_API_KEY,
  model: "mistral-small-latest",
});

// invoke accepts a string or an array of messages.
const response = await model.invoke([new HumanMessage("Hello world!")]);
console.log(response.content);
```
**Chat Model Example (streaming):**
```typescript
import { ChatMistralAI } from "@langchain/mistralai";
import { HumanMessage } from "@langchain/core/messages";

const model = new ChatMistralAI({
  apiKey: process.env.MISTRAL_API_KEY,
  model: "mistral-small-latest",
});

// stream() resolves to an async iterable of message chunks.
const stream = await model.stream([new HumanMessage("Hello world!")]);
for await (const chunk of stream) {
  console.log(chunk.content);
}
```
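Since `stream()` yields an async iterable of chunks, a common pattern is to collect the pieces into one string. A sketch assuming each chunk exposes a string `content` field for plain text responses (demonstrated here with a mock stream, since the real one requires an API key):

```typescript
// Accumulate the content of streamed chunks into a single string.
// Works with any async iterable whose items carry a `content` field.
async function collectStream(
  stream: AsyncIterable<{ content: unknown }>
): Promise<string> {
  let text = "";
  for await (const chunk of stream) {
    if (typeof chunk.content === "string") {
      text += chunk.content;
    }
  }
  return text;
}

// Demo with a mock stream; in practice pass the result of model.stream(...).
async function* mockStream() {
  yield { content: "Hello" };
  yield { content: ", " };
  yield { content: "world!" };
}
```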
**Embeddings Example:**
```typescript
import { MistralAIEmbeddings } from "@langchain/mistralai";

const embeddings = new MistralAIEmbeddings({
  apiKey: process.env.MISTRAL_API_KEY,
});

// embedQuery returns the embedding vector as a number[].
const res = await embeddings.embedQuery("Hello world");
```
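Because `embedQuery` returns a plain `number[]`, downstream similarity math needs no extra dependencies. A minimal cosine-similarity sketch (the `cosineSimilarity` helper is illustrative, not part of @langchain/mistralai):

```typescript
// Cosine similarity between two equal-length embedding vectors.
// Returns 1 for identical directions, 0 for orthogonal vectors.
function cosineSimilarity(a: number[], b: number[]): number {
  if (a.length !== b.length) {
    throw new Error("Vectors must have the same length");
  }
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}
```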
After installation, verify that `@langchain/core` resolves to a single version in your lockfile and that the example code runs without import or version-conflict errors.
This skill responds to prompts such as:
**Basic usage:**
"Install LangChain MistralAI integration"
**With specific configuration:**
"Set up MistralAI for LangChain with embeddings support"
**For existing projects:**
"Add MistralAI chat models to my LangChain project"