A library for integrating LLM functionality into Grafana plugins, with streaming chat completions, OpenAI-compatible APIs, and MCP (Model Context Protocol) support. Includes Jest configuration helpers for ES modules.
The `@grafana/llm` library provides convenience functions and components for working with LLMs in Grafana plugin development: streaming responses via RxJS observables, a choice of model sizes, and helpers for test configuration.
Add `@grafana/llm` to your plugin's `package.json`:
```json
{
  "dependencies": {
    "@grafana/llm": "0.22.0"
  }
}
```
Run `npm install` or `yarn install` to install the dependency.
Create a component that uses the `openai` object from `@grafana/llm`:
```typescript
import React, { useState } from 'react';
import { useAsync } from 'react-use';
import { openai } from '@grafana/llm';
import { Button, Input, Spinner } from '@grafana/ui';

const LLMChatComponent = (): JSX.Element => {
  const [input, setInput] = useState('');
  const [message, setMessage] = useState('');
  const [reply, setReply] = useState('');

  const { loading, error } = useAsync(async () => {
    // Check that the LLM app plugin is installed, enabled and configured.
    const enabled = await openai.enabled();
    if (!enabled) {
      return false;
    }
    if (message === '') {
      return;
    }
    const stream = openai
      .streamChatCompletions({
        model: openai.Model.BASE, // Use Model.LARGE for longer context or complex tasks.
        messages: [
          {
            role: 'system',
            content: 'You are a helpful assistant with deep knowledge of Grafana, Prometheus and observability.',
          },
          { role: 'user', content: message },
        ],
      })
      // Accumulate the streamed deltas into the full message so far.
      .pipe(openai.accumulateContent());
    return stream.subscribe(setReply);
  }, [message]);

  if (error) {
    return <div>Error: {error.message}</div>;
  }

  return (
    <div>
      <Input
        value={input}
        onChange={(e) => setInput(e.currentTarget.value)}
        placeholder="Ask about Grafana or observability"
      />
      <Button type="submit" onClick={() => setMessage(input)}>
        Submit
      </Button>
      <div>{loading ? <Spinner /> : reply}</div>
    </div>
  );
};
```
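Streaming gives the best interactive experience, but for short one-shot requests the library also exposes a promise-based `chatCompletions` function. A minimal sketch, assuming the OpenAI-compatible response shape with the reply under `choices[0].message.content`:
```typescript
import { openai } from '@grafana/llm';

// One-shot (non-streaming) completion. The response shape is assumed to be
// OpenAI-compatible, with the reply at choices[0].message.content.
async function askOnce(question: string): Promise<string> {
  const enabled = await openai.enabled();
  if (!enabled) {
    return 'LLM features are not enabled in this Grafana instance.';
  }
  const response = await openai.chatCompletions({
    model: openai.Model.BASE,
    messages: [{ role: 'user', content: question }],
  });
  return response.choices[0].message.content ?? '';
}
```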
If you encounter `SyntaxError: Cannot use import statement outside a module` errors in tests, update your Jest configuration:
```javascript
// jest.config.js
const { grafanaESModules, nodeModulesToTransform } = require('./.config/jest/utils');
const { grafanaLLMESModules } = require('@grafana/llm/jest');

module.exports = {
  ...require('./.config/jest.config'),
  transformIgnorePatterns: [nodeModulesToTransform([...grafanaESModules, ...grafanaLLMESModules])],
};
```
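With the transform patterns in place, `@grafana/llm` imports will parse under Jest; in unit tests you will usually still want to stub the network-facing calls. One way to do that with `jest.mock` is sketched below (the mocked shapes are illustrative assumptions, not the library's real transport):
```typescript
// MyComponent.test.tsx
import { of } from 'rxjs';
import { openai } from '@grafana/llm';

jest.mock('@grafana/llm', () => ({
  openai: {
    enabled: jest.fn(),
    streamChatCompletions: jest.fn(),
    // Identity operator, so the mocked stream passes through .pipe() unchanged.
    accumulateContent: jest.fn(() => (source: unknown) => source),
  },
}));

test('streams a reply when the LLM is enabled', async () => {
  (openai.enabled as jest.Mock).mockResolvedValue(true);
  // Emit pre-accumulated strings, as accumulateContent would produce.
  (openai.streamChatCompletions as jest.Mock).mockReturnValue(of('Hello', 'Hello there'));
  // ...render the component, submit a prompt, and assert on the reply...
});
```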
If using MCP (Model Context Protocol) features, add the `TransformStream` polyfill:
```javascript
// jest-setup.js
import './.config/jest-setup';
import { TransformStream } from 'node:stream/web';
import { TextEncoder } from 'util';
global.TextEncoder = TextEncoder;
global.TransformStream = TransformStream;
```
Choose the appropriate model based on your use case: `openai.Model.BASE` for fast, lower-cost completions, and `openai.Model.LARGE` for longer context windows or more complex reasoning.
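A minimal sketch of switching between the two (the `needsLongContext` flag is a hypothetical stand-in for your own heuristic):
```typescript
import { openai } from '@grafana/llm';

// Hypothetical heuristic: use the larger model only when the task calls
// for a longer context window or more complex reasoning.
const pickModel = (needsLongContext: boolean) =>
  needsLongContext ? openai.Model.LARGE : openai.Model.BASE;
```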
The library uses RxJS observables for streaming, so standard operator and subscription-management patterns apply (see the cleanup sketch after the list below). Common use cases for LLM integration in Grafana plugins include:
1. **Observability assistant**: Answer questions about metrics, logs, and traces
2. **Query builder**: Generate PromQL or LogQL queries from natural language
3. **Dashboard helper**: Suggest visualizations and panel configurations
4. **Alert assistant**: Help users write alert rules and conditions
5. **Data exploration**: Provide insights about time series data
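Because `streamChatCompletions` returns an observable, components should unsubscribe when the prompt changes or the component unmounts, or a stale stream will keep updating state. A minimal sketch of that cleanup (the hook name and structure are ours, not part of the library):
```typescript
import { useEffect, useState } from 'react';
import { openai } from '@grafana/llm';

// Accumulates a streamed reply and tears the subscription down when the
// prompt changes or the component unmounts.
function useStreamedReply(prompt: string): string {
  const [reply, setReply] = useState('');
  useEffect(() => {
    if (prompt === '') {
      return;
    }
    const subscription = openai
      .streamChatCompletions({
        model: openai.Model.BASE,
        messages: [{ role: 'user', content: prompt }],
      })
      .pipe(openai.accumulateContent())
      .subscribe(setReply);
    // Unsubscribing cancels the in-flight stream.
    return () => subscription.unsubscribe();
  }, [prompt]);
  return reply;
}
```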