Fine-tuned Llama 3.1 8B model specialized for function calling tasks, trained on the xlam-function-calling-60k dataset using reward-model-filtered data.
A specialized fine-tuned version of Meta's Llama 3.1 8B Instruct model, optimized for function calling tasks. This model enables AI agents to understand user queries and generate appropriate function calls with correct parameters.
This skill provides access to a function-calling-optimized language model that converts natural-language requests into structured function calls. The model was trained on a curated subset of the Salesforce xlam-function-calling-60k dataset, selected with a reward model for optimal performance.
When a user requests function calling capabilities or needs to convert natural language queries into API/function calls:
1. **Install Required Dependencies**
- Install the `unsloth` library for optimized model loading
- Install `transformers` for model inference
- Ensure CUDA is available for GPU acceleration
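A minimal environment setup for the steps above might look like this (a sketch; it assumes a CUDA-capable GPU and the package names as published on PyPI):

```shell
# Install unsloth (pulls in transformers, torch, etc. as dependencies)
pip install unsloth transformers

# Sanity-check that a CUDA device is visible
python -c "import torch; print(torch.cuda.is_available())"
```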
2. **Load the Model**
- Use `FastLanguageModel.from_pretrained()` with model name `kesimeg/function-calling-llama-3.1-8B`
- Set max sequence length to 4096 tokens
- Apply the Llama 3.1 chat template to the tokenizer
- Enable inference mode with `FastLanguageModel.for_inference()`
3. **For Simple Instruction Prompts**
- Create a conversation with user role and content
- Apply chat template without tools
- Tokenize and generate response with streaming
- Example: "What is the integral of cos(x)"
4. **For Function Calling Tasks**
- Define the query requesting a specific function call
- Provide tool definitions as JSON array with:
- Function name
- Description
- Parameters with types and descriptions
- Apply chat template WITH the tools object
- Replace `"parameters": d` with `"arguments": d` in the formatted text (dataset convention)
- Re-apply chat template to wrap as user message
- Generate with streaming enabled
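The tool-definition and patching steps above can be sketched in isolation. The `get_products` tool below is illustrative, and in the full pipeline the formatted text would come from `tokenizer.apply_chat_template(..., tools=tools, tokenize=False)`; here the tools are simply serialized to show the `"parameters"` → `"arguments"` patch.

```python
import json

# Illustrative tool definition with name, description, and typed parameters
tools = [{
    "name": "get_products",
    "description": "Fetch a slice of products from the catalog.",
    "parameters": {
        "limit": {"type": "int", "description": "How many products to return."},
        "skip": {"type": "int", "description": "How many products to skip."},
    },
}]

query = "I want to skip the first 200 products and get 15 products from the catalog"

# Stand-in for the chat-template output containing the tool definitions
formatted = json.dumps({"query": query, "tools": tools})

# The training data uses "arguments" where the template emits "parameters",
# so the formatted text is patched before being re-wrapped as a user message
patched = formatted.replace('"parameters":', '"arguments":')
```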
5. **Output Format**
- The model will generate structured function calls
- Outputs include function name and arguments in proper format
- Stream results to show progress during generation
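Assuming the model emits xlam-style JSON tool calls (an assumption about the exact output format, based on the training data convention), the generated text can be parsed back into callable form:

```python
import json

# Example of what a generated tool call might look like (illustrative only)
generated = '[{"name": "get_products", "arguments": {"limit": 15, "skip": 200}}]'

calls = json.loads(generated)
for call in calls:
    args = ", ".join(f"{k}={v}" for k, v in call["arguments"].items())
    print(f"{call['name']}({args})")  # → get_products(limit=15, skip=200)
```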
**Simple query:**
```
User: "What is the integral of cos(x)"
Model generates: Instructional response about integration
```
**Function calling query:**
```
User: "I want to skip the first 200 products and get 15 products from the catalog"
Tool: get_products(limit: int, skip: int)
Model generates: get_products(limit=15, skip=200)
```