A Llama 3 8B model fine-tuned by Groq for function calling and tool use. This skill lets AI agents invoke external functions and APIs: the model accepts function schemas within XML tags and returns properly formatted JSON function calls with arguments, for seamless integration with external tools.

**Setup:**
1. Download one of the quantized GGUF models from the source (Q4_K_M or Q5_K_M recommended for balanced quality/size)
2. Ensure you have LlamaEdge (v0.12.4+) or llama.cpp installed
3. Set the prompt template to `groq-llama3-tool` format
**Prompt template:** the model expects prompts structured as follows:
```
<|start_header_id|>system<|end_header_id|>
You are a function calling AI model. You are provided with function signatures within <tools></tools> XML tags. You may call one or more functions to assist with the user query. Don't make assumptions about what values to plug into functions. For each function call return a json object with function name and arguments within <tool_call></tool_call> XML tags as follows:
<tool_call>
{"name": <function-name>,"arguments": <args-dict>}
</tool_call>
Here are the available tools:
<tools>
[Insert JSON function schemas here]
</tools><|eot_id|><|start_header_id|>user<|end_header_id|>
[User query]<|eot_id|><|start_header_id|>assistant<|end_header_id|>
```
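When calling the model through a raw completion endpoint, the template above has to be assembled by hand. A minimal sketch in Python (the helper name and the example schema are illustrative, not part of the skill):

```python
import json

SYSTEM = (
    "You are a function calling AI model. You are provided with function "
    "signatures within <tools></tools> XML tags. You may call one or more "
    "functions to assist with the user query. Don't make assumptions about "
    "what values to plug into functions. For each function call return a "
    "json object with function name and arguments within "
    "<tool_call></tool_call> XML tags as follows:\n"
    '<tool_call>\n{"name": <function-name>,"arguments": <args-dict>}\n</tool_call>'
)

def build_prompt(tools: list[dict], query: str) -> str:
    """Assemble a groq-llama3-tool prompt from JSON tool schemas and a user query."""
    schemas = "\n".join(json.dumps(t) for t in tools)
    return (
        "<|start_header_id|>system<|end_header_id|>\n"
        f"{SYSTEM}\nHere are the available tools:\n<tools>\n{schemas}\n</tools>"
        "<|eot_id|><|start_header_id|>user<|end_header_id|>\n"
        f"{query}<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n"
    )

weather_tool = {
    "name": "get_weather",
    "description": "Get current weather for a location",
    "parameters": {
        "type": "object",
        "properties": {"location": {"type": "string"}},
        "required": ["location"],
    },
}

prompt = build_prompt([weather_tool], "What's the weather in Paris?")
```

The assembled string is then sent as-is to the completion endpoint; servers that apply the `groq-llama3-tool` template themselves (such as llama-api-server below) do this step for you.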
**Tool definitions:** define tools using JSON Schema within `<tools>` tags:
```json
{
  "name": "function_name",
  "description": "What the function does",
  "parameters": {
    "type": "object",
    "properties": {
      "param1": {
        "type": "string",
        "description": "Parameter description"
      }
    },
    "required": ["param1"]
  }
}
```
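Malformed schemas silently degrade tool-call quality, so it can help to sanity-check each definition before inserting it into the `<tools>` block. A small validator sketch (these particular checks are illustrative, not required by the model):

```python
def check_tool_schema(tool: dict) -> list[str]:
    """Return a list of problems found in a tool schema (empty if it looks OK)."""
    problems = []
    for key in ("name", "description", "parameters"):
        if key not in tool:
            problems.append(f"missing top-level key: {key}")
    params = tool.get("parameters", {})
    if params.get("type") != "object":
        problems.append('parameters.type should be "object"')
    props = params.get("properties", {})
    for req in params.get("required", []):
        if req not in props:
            problems.append(f"required parameter not in properties: {req}")
    return problems

good = {
    "name": "get_weather",
    "description": "Get current weather",
    "parameters": {
        "type": "object",
        "properties": {"location": {"type": "string"}},
        "required": ["location"],
    },
}
# Missing description; declares a required parameter that has no definition.
bad = {"name": "broken", "parameters": {"type": "object", "properties": {}, "required": ["q"]}}
```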
Run the model as an API server with LlamaEdge:

```bash
wasmedge --dir .:. --nn-preload default:GGML:AUTO:Llama-3-Groq-8B-Tool-Use-Q5_K_M.gguf \
  llama-api-server.wasm \
  --prompt-template groq-llama3-tool \
  --ctx-size 8192 \
  --model-name Llama-3-Groq-8B
```
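llama-api-server exposes an OpenAI-compatible `/v1/chat/completions` endpoint (port 8080 by default). A sketch of the request body for a tool-use chat, assuming those defaults; actually sending it requires the server to be running, so only the payload construction is shown:

```python
import json

def chat_request(model: str, user_msg: str, tools: list[dict]) -> str:
    """Serialize an OpenAI-style chat completion request with tool definitions."""
    body = {
        "model": model,
        "messages": [{"role": "user", "content": user_msg}],
        # OpenAI-style wrapper around the JSON schemas shown above
        "tools": [{"type": "function", "function": t} for t in tools],
    }
    return json.dumps(body)

payload = chat_request(
    "Llama-3-Groq-8B",
    "What's the weather in Paris?",
    [{"name": "get_weather", "description": "Get current weather",
      "parameters": {"type": "object",
                     "properties": {"location": {"type": "string"}},
                     "required": ["location"]}}],
)
# POST this payload to http://localhost:8080/v1/chat/completions
```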
Or run it directly with llama.cpp:

```bash
./llama-cli -m Llama-3-Groq-8B-Tool-Use-Q5_K_M.gguf \
  -p "<prompt>" \
  -n 512 \
  -c 8192
```
**Output format:** the model returns function calls wrapped in `<tool_call>` tags:
```xml
<tool_call>
{"name": "get_current_weather", "arguments": {"location": "San Francisco, CA", "unit": "celsius"}}
</tool_call>
```
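On the client side, the `<tool_call>` wrapper has to be stripped before the JSON payload can be used. A minimal regex-based parser sketch (real outputs may contain several calls, so it returns a list):

```python
import json
import re

TOOL_CALL_RE = re.compile(r"<tool_call>\s*(\{.*?\})\s*</tool_call>", re.DOTALL)

def parse_tool_calls(text: str) -> list[dict]:
    """Extract every {"name": ..., "arguments": ...} object from model output."""
    return [json.loads(m) for m in TOOL_CALL_RE.findall(text)]

output = (
    '<tool_call>\n'
    '{"name": "get_current_weather", "arguments": '
    '{"location": "San Francisco, CA", "unit": "celsius"}}\n'
    '</tool_call>'
)
calls = parse_tool_calls(output)
```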
**Example:**

**Input:**
```
Tools: get_weather(location, unit), search_web(query)
Query: What's the weather in Paris?
```
**Output:**
```xml
<tool_call>
{"name": "get_weather", "arguments": {"location": "Paris, France", "unit": "celsius"}}
</tool_call>
```