A JavaScript SDK implementation of the ERM (Exemplar and Reflection Memory) framework for optimizing Large Language Model prompts. It iteratively refines prompts by collecting feedback and exemplars to improve performance on classification and other LLM tasks.
This skill guides you in building and using a JavaScript-based prompt optimization framework that leverages feedback and exemplars to iteratively improve LLM prompt performance. The framework follows DRY and SOLID principles, emphasizing modularity, maintainability, and robust error handling.
The framework consists of several key classes, described in the component sections below.
Create a new JavaScript/Node.js project with the following structure:
```
project/
├── src/
│   ├── prompt_builder.js
│   ├── memory.js
│   ├── erm_optimizer.js
│   └── utils.js
├── config/
│   └── api_config.js
└── package.json
```
**PromptBuilder**: Create a class that constructs prompts incrementally with methods like `addSection()`, `addExample()`, and `build()`.
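A minimal sketch of such a builder. The method names come from the description above; the internal representation (an array of sections joined by blank lines) is an assumption:

```javascript
// Hypothetical PromptBuilder sketch — constructs a prompt incrementally.
class PromptBuilder {
  constructor() {
    this.sections = [];
  }

  // Append a free-form instruction section.
  addSection(title, body) {
    this.sections.push(`## ${title}\n${body}`);
    return this; // chainable
  }

  // Append a worked input/label example.
  addExample(input, label) {
    this.sections.push(`Text: ${input}\nLabel: ${label}`);
    return this;
  }

  // Join the accumulated sections into the final prompt string.
  build() {
    return this.sections.join('\n\n');
  }
}

const prompt = new PromptBuilder()
  .addSection('Task', 'Classify the input text into one category.')
  .addExample('E=mc^2', 'physics')
  .build();
```

Returning `this` from each mutator keeps the builder chainable, which is the usual idiom for this pattern in JavaScript.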
**Memory Classes**: Implement `MemoryBase` with shared functionality for score management, then extend it with `FeedbackMemory` and `ExemplarFactory` for specialized storage and retrieval.
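One way the hierarchy might be sketched — the class names are from the description above, while the score-sorted storage and the specific method names (`addFeedback`, `addExemplar`, `top`) are assumptions:

```javascript
// Hypothetical sketch of the memory hierarchy.
// MemoryBase holds the shared score bookkeeping; subclasses specialize
// what is stored and how it is retrieved.
class MemoryBase {
  constructor() {
    this.entries = [];
  }

  // Shared score management: keep entries sorted best-first.
  add(entry, score) {
    this.entries.push({ entry, score });
    this.entries.sort((a, b) => b.score - a.score);
  }

  // Retrieve the n highest-scoring entries.
  top(n) {
    return this.entries.slice(0, n).map(e => e.entry);
  }
}

// Stores textual feedback generated from failed predictions.
class FeedbackMemory extends MemoryBase {
  addFeedback(text, score) {
    this.add({ text }, score);
  }
}

// Stores successful examples with metadata for later prompt injection.
class ExemplarFactory extends MemoryBase {
  addExemplar(input, target, meta = {}) {
    this.add({ input, target, ...meta }, meta.score ?? 1);
  }
}
```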
**ERM Optimizer**: Build the main optimization class that drives the evaluate–feedback–refine loop described in the optimization workflow below.
**Code Quality**: Apply DRY and SOLID principles; keep each class focused on a single responsibility.

**Error Handling**: Wrap API calls and response parsing in try/catch, log failures, and degrade gracefully rather than crashing the optimization loop.

**Functional Programming**: Prefer pure functions and immutable data for scoring and data transformations.
1. **Initialize**: Load initial prompt and data
2. **Evaluate**: Run prompt against validation data
3. **Collect Feedback**: Identify failures and generate feedback
4. **Store Exemplars**: Save successful examples with metadata
5. **Refine Prompt**: Use feedback and exemplars to improve prompt
6. **Iterate**: Repeat evaluation and refinement for N steps
7. **Return**: Output final prompt and performance metrics
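The seven steps above can be sketched as a single loop. Here `evaluate` and `refine` stand in for the model-calling internals and are assumptions, as is the exact `optimize` signature:

```javascript
// Sketch of the ERM optimization loop (steps 1–7 above).
// `evaluate(prompt, data)` returns a score; `refine(prompt, data)` uses
// feedback and exemplars to produce a candidate prompt — both hypothetical.
async function optimize(prompt, trainData, valData, steps, { evaluate, refine }) {
  let bestPrompt = prompt;                                  // 1. Initialize
  let bestScore = await evaluate(bestPrompt, valData);      // 2. Evaluate
  for (let i = 0; i < steps; i++) {                         // 6. Iterate
    const candidate = await refine(bestPrompt, trainData);  // 3–5. Feedback, exemplars, refine
    const score = await evaluate(candidate, valData);
    if (score > bestScore) {                                // keep only improvements
      bestScore = score;
      bestPrompt = candidate;
    }
  }
  return [bestScore, bestPrompt];                           // 7. Return
}
```

Keeping only candidates that beat the current best makes the loop monotone in validation score, which is a common safeguard in prompt-optimization loops.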
```javascript
const { ERM } = require('./erm_optimizer');
const winston = require('winston');

// winston needs at least one transport configured before it will write logs
const logger = winston.createLogger({
  transports: [new winston.transports.Console()],
});

// Prepare data
const trainData = [
  { input: "Turn left at the next street.", target: "navigation" },
  { input: "E=mc^2", target: "physics" }
];
const valData = [
  { input: "The capital of France is Paris.", target: "geography" }
];
const testData = [
  { input: "Photosynthesis occurs in the chloroplast.", target: "biology" }
];

// Initial prompt
const initialPrompt = `
Based on the input text, classify it into one of the predefined categories.
Provide the category label only.
Text: {input}
Label:
`.trim();

// Top-level await is not available in CommonJS, so wrap the run in an async IIFE
(async () => {
  // Optimize
  const erm = new ERM();
  const [finalScore, finalPrompt] = await erm.optimize(
    initialPrompt,
    trainData,
    valData,
    5 // optimization steps
  );

  logger.info(`Final validation score: ${finalScore}`);
  logger.info(`Optimized prompt:\n${finalPrompt}`);

  // Evaluate on test set
  const testResults = await erm.evaluate(finalPrompt, testData);
  const testScore = testResults.filter(r => r).length / testResults.length;
  logger.info(`Test score: ${testScore}`);
})();
```
When building prompts for LLM optimization, keep the template modular: separate the instructions, the examples, and the input slot so the optimizer can revise each section independently.
Install required packages:
```bash
npm install winston dotenv
```
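The `config/api_config.js` module from the project structure is a natural place to use `dotenv`. A sketch, where the variable names (`LLM_API_KEY`, `LLM_API_URL`, `LLM_MAX_RETRIES`) are assumptions:

```javascript
// Hypothetical config/api_config.js — loads API settings from a .env file.
// Keeping secrets in the environment avoids committing them to the repo.
require('dotenv').config();

module.exports = {
  apiKey: process.env.LLM_API_KEY,
  apiUrl: process.env.LLM_API_URL || 'https://api.example.com/v1', // placeholder default
  maxRetries: Number(process.env.LLM_MAX_RETRIES || 3),
};
```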