Install and configure the Google Vertex AI provider for the AI SDK to use Gemini and Anthropic Claude models in Node.js or Edge runtimes, with authentication and prompt caching.

This skill helps AI coding agents install and configure the `@ai-sdk/google-vertex` package, supporting:

- Google Gemini and Anthropic Claude models
- Node.js and Edge runtimes
- Standard and custom authentication
- Prompt caching for Claude models
Install the Google Vertex AI provider package:
```bash
npm i @ai-sdk/google-vertex
```
Ask the user which runtime environment they are using:
#### For Node.js Runtime
The Node.js provider supports all standard Google Cloud authentication methods via `google-auth-library`. The most common method is:
1. Obtain credentials JSON from [Google Cloud Console](https://console.cloud.google.com/apis/credentials)
2. Set the `GOOGLE_APPLICATION_CREDENTIALS` environment variable to the path of the credentials file
Example `.env` file:
```
GOOGLE_APPLICATION_CREDENTIALS=/path/to/credentials.json
```
#### For Edge Runtime
The Edge provider requires Application Default Credentials via environment variables:
```
GOOGLE_CLIENT_EMAIL=your-client-email@project.iam.gserviceaccount.com
GOOGLE_PRIVATE_KEY="-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n"
GOOGLE_PRIVATE_KEY_ID=key-id-here
```
These values can be extracted from a credentials JSON file obtained from Google Cloud Console.
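The three variables map directly onto fields of the downloaded key file (`client_email`, `private_key`, `private_key_id`). As a sketch of that extraction (the script name and file path are placeholders, not part of the provider), a small Node script can convert a key file into `.env` lines:

```typescript
import fs from 'node:fs';

// Shape of the relevant fields in a service-account key file
// downloaded from the Google Cloud Console.
type ServiceAccountKey = {
  client_email: string;
  private_key: string;
  private_key_id: string;
};

// Render the three environment variables the Edge provider reads.
function credentialsToEnv(key: ServiceAccountKey): string {
  return [
    `GOOGLE_CLIENT_EMAIL=${key.client_email}`,
    // Escape real newlines so the PEM key fits on one .env line.
    `GOOGLE_PRIVATE_KEY="${key.private_key.replace(/\n/g, '\\n')}"`,
    `GOOGLE_PRIVATE_KEY_ID=${key.private_key_id}`,
  ].join('\n');
}

// Usage: node extract-env.ts /path/to/credentials.json >> .env
if (process.argv[2]) {
  const key: ServiceAccountKey = JSON.parse(
    fs.readFileSync(process.argv[2], 'utf8'),
  );
  console.log(credentialsToEnv(key));
}
```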
#### For Google Gemini Models
**Node.js runtime:**
```typescript
import { vertex } from '@ai-sdk/google-vertex';
import { generateText } from 'ai';
const { text } = await generateText({
  model: vertex('gemini-1.5-flash'),
  prompt: 'Write a vegetarian lasagna recipe.',
});
```
**Edge runtime:**
```typescript
import { vertex } from '@ai-sdk/google-vertex/edge';
import { generateText } from 'ai';
const { text } = await generateText({
  model: vertex('gemini-1.5-flash'),
  prompt: 'Write a vegetarian lasagna recipe.',
});
```
#### For Anthropic Claude Models via Google Vertex
**Node.js runtime:**
```typescript
import { vertexAnthropic } from '@ai-sdk/google-vertex/anthropic';
import { generateText } from 'ai';
const { text } = await generateText({
  model: vertexAnthropic('claude-3-5-sonnet@20240620'),
  prompt: 'Write a vegetarian lasagna recipe.',
});
```
**Edge runtime:**
```typescript
import { vertexAnthropic } from '@ai-sdk/google-vertex/anthropic/edge';
import { generateText } from 'ai';
const { text } = await generateText({
  model: vertexAnthropic('claude-3-5-sonnet@20240620'),
  prompt: 'Write a vegetarian lasagna recipe.',
});
```
If the user needs custom configuration (specific project ID, location, or credentials), create a custom provider instance.
**Node.js with custom auth:**
```typescript
import { createVertex } from '@ai-sdk/google-vertex';
import { generateText } from 'ai';
const customProvider = createVertex({
  project: 'your-project-id',
  location: 'us-central1',
  googleAuthOptions: {
    credentials: {
      client_email: 'your-client-email',
      private_key: 'your-private-key',
    },
  },
});

const { text } = await generateText({
  model: customProvider('gemini-1.5-flash'),
  prompt: 'Write a vegetarian lasagna recipe.',
});
```
**Edge with custom credentials:**
```typescript
import { createVertex } from '@ai-sdk/google-vertex/edge';
import { generateText } from 'ai';
const customProvider = createVertex({
  project: 'your-project-id',
  location: 'us-central1',
  googleCredentials: {
    clientEmail: 'your-client-email',
    privateKey: 'your-private-key',
  },
});

const { text } = await generateText({
  model: customProvider('gemini-1.5-flash'),
  prompt: 'Write a vegetarian lasagna recipe.',
});
```
**Anthropic custom configuration (Node.js):**
```typescript
import { createVertexAnthropic } from '@ai-sdk/google-vertex/anthropic';
import { generateText } from 'ai';
const customProvider = createVertexAnthropic({
  project: 'your-project-id',
  location: 'us-east5',
});

const { text } = await generateText({
  model: customProvider('claude-3-5-sonnet@20240620'),
  prompt: 'Write a vegetarian lasagna recipe.',
});
```
If using Anthropic Claude models and the user wants to optimize costs and latency, implement prompt caching:
```typescript
import { vertexAnthropic } from '@ai-sdk/google-vertex/anthropic';
import { generateText } from 'ai';
import fs from 'node:fs';
const errorMessage = fs.readFileSync('data/error-message.txt', 'utf8');
const result = await generateText({
  model: vertexAnthropic('claude-3-5-sonnet-v2@20241022', {
    cacheControl: true,
  }),
  messages: [
    {
      role: 'user',
      content: [
        {
          type: 'text',
          text: 'You are a JavaScript expert.',
        },
        {
          type: 'text',
          text: `Error message: ${errorMessage}`,
          providerOptions: {
            anthropic: {
              cacheControl: { type: 'ephemeral' },
            },
          },
        },
        {
          type: 'text',
          text: 'Explain the error message.',
        },
      ],
    },
  ],
});
console.log(result.text);
console.log(result.providerMetadata?.anthropic);
// e.g. { cacheCreationInputTokens: 2118, cacheReadInputTokens: 0 }
```
**Key points about prompt caching:**

- Enable caching by passing `cacheControl: true` in the model settings.
- Mark the content parts to cache with `providerOptions.anthropic.cacheControl: { type: 'ephemeral' }`.
- Inspect `result.providerMetadata?.anthropic` to verify cache behavior: `cacheCreationInputTokens` counts tokens written to the cache, and `cacheReadInputTokens` counts tokens served from it on subsequent requests.
Create a test file and run it to verify the configuration works:
```typescript
import { vertex } from '@ai-sdk/google-vertex'; // or /edge or /anthropic
import { generateText } from 'ai';
async function test() {
  const { text } = await generateText({
    model: vertex('gemini-1.5-flash'), // or vertexAnthropic('claude-3-5-sonnet@20240620')
    prompt: 'Say hello!',
  });
  console.log(text);
}

test().catch(console.error);
```