A 70B-parameter Llama 3 model fine-tuned by Groq for function calling and tool use. It excels at understanding natural-language requests and translating them into structured, JSON-formatted function calls with appropriate arguments.
This skill provides access to Groq's Llama 3 70B Tool Use model in GGUF format, quantized by Second State Inc. The model is designed to interpret user queries and invoke functions with precise JSON-formatted arguments, making it ideal for building AI agents that need to interact with APIs, databases, or other tools.
Download a quantized GGUF file appropriate for your hardware (the commands below use the Q5_K_M quantization), then run the model with LlamaEdge in one of two ways:
**As a Service (API Server)**:
```bash
wasmedge --dir .:. --nn-preload default:GGML:AUTO:Llama-3-Groq-70B-Tool-Use-Q5_K_M.gguf \
llama-api-server.wasm \
--prompt-template groq-llama3-tool \
--ctx-size 8192 \
--model-name Llama-3-Groq-70B
```
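Once the server is up, it exposes an OpenAI-compatible chat API. A minimal sketch of a client request, assuming the LlamaEdge defaults of port 8080 and the `/v1/chat/completions` route (adjust if your server is configured differently):

```python
import json
import urllib.request

# Sketch only: assumes the llama-api-server is listening on localhost:8080
# and exposes an OpenAI-compatible /v1/chat/completions endpoint.
payload = {
    "model": "Llama-3-Groq-70B",  # must match the --model-name passed to the server
    "messages": [
        {"role": "user", "content": "What's the weather like in San Francisco in Celsius?"},
    ],
}

request = urllib.request.Request(
    "http://localhost:8080/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
# response = urllib.request.urlopen(request)  # uncomment with a running server
```

The commented-out `urlopen` call is left for you to enable once the server is actually running.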
**As a Command-Line App**:
```bash
wasmedge --dir .:. --nn-preload default:GGML:AUTO:Llama-3-Groq-70B-Tool-Use-Q5_K_M.gguf \
llama-chat.wasm \
--prompt-template groq-llama3-tool \
--ctx-size 8192
```
The model expects function definitions in XML `<tools>` tags within the system prompt:
```text
<|start_header_id|>system<|end_header_id|>
You are a function calling AI model. You are provided with function signatures within <tools></tools> XML tags. You may call one or more functions to assist with the user query. Don't make assumptions about what values to plug into functions. For each function call return a json object with function name and arguments within <tool_call></tool_call> XML tags as follows:
<tool_call>
{"name": <function-name>,"arguments": <args-dict>}
</tool_call>
Here are the available tools:
<tools>
[Your JSON function schemas here]
</tools><|eot_id|><|start_header_id|>user<|end_header_id|>
[User query]<|eot_id|><|start_header_id|>assistant<|end_header_id|>
```
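The template above can be assembled programmatically. A sketch of a helper that embeds tool schemas in the expected `<tools>` block (the helper name and the one-field schema here are illustrative, not part of LlamaEdge):

```python
import json

def build_system_prompt(tools):
    """Embed JSON tool schemas in the <tools> block the model expects.
    Illustrative helper; the instruction text mirrors the template above."""
    schemas = "\n".join(json.dumps(t) for t in tools)
    return (
        "You are a function calling AI model. You are provided with function "
        "signatures within <tools></tools> XML tags. You may call one or more "
        "functions to assist with the user query. Don't make assumptions about "
        "what values to plug into functions. For each function call return a "
        "json object with function name and arguments within "
        "<tool_call></tool_call> XML tags as follows:\n"
        '<tool_call>\n{"name": <function-name>,"arguments": <args-dict>}\n</tool_call>\n'
        "Here are the available tools:\n<tools>\n" + schemas + "\n</tools>"
    )

# Example with a minimal placeholder schema.
prompt = build_system_prompt([{"name": "get_current_weather"}])
```

The returned string is the system-message body; the `<|start_header_id|>` framing is applied by the `groq-llama3-tool` prompt template, so you only supply the content.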
Define tools using JSON schema format:
```json
{
"name": "function_name",
"description": "What this function does",
"parameters": {
"type": "object",
"properties": {
"param_name": {
"type": "string",
"description": "Parameter description",
"enum": ["option1", "option2"]
}
},
"required": ["param_name"]
}
}
```
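For instance, a weather-lookup tool matching the output examples below might be declared as follows (a sketch; the field names mirror the format above):

```python
import json

# Concrete schema for the get_current_weather example used below.
get_current_weather_schema = {
    "name": "get_current_weather",
    "description": "Get the current weather for a location",
    "parameters": {
        "type": "object",
        "properties": {
            "location": {
                "type": "string",
                "description": "City and state, e.g. San Francisco, CA",
            },
            "unit": {
                "type": "string",
                "description": "Temperature unit",
                "enum": ["celsius", "fahrenheit"],
            },
        },
        "required": ["location"],
    },
}

# Serialize for embedding in the <tools> block of the system prompt.
tools_block = "<tools>\n" + json.dumps(get_current_weather_schema) + "\n</tools>"
```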
The model returns tool calls wrapped in XML tags:
```xml
<tool_call>
{"name": "get_current_weather","arguments": {"location": "San Francisco, CA", "unit": "celsius"}}
</tool_call>
```
**User Query**: "What's the weather like in San Francisco in Celsius?"
**Model Output**:
```xml
<tool_call>
{"name": "get_current_weather","arguments": {"location": "San Francisco, CA", "unit": "celsius"}}
</tool_call>
```
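Extracting the function name and arguments from this output takes only a short regex plus `json.loads`. A sketch (responses may contain more than one `<tool_call>` block, so `findall` is used):

```python
import json
import re

def parse_tool_calls(text):
    """Extract {"name": ..., "arguments": ...} dicts from <tool_call> tags."""
    pattern = re.compile(r"<tool_call>\s*(\{.*?\})\s*</tool_call>", re.DOTALL)
    return [json.loads(m) for m in pattern.findall(text)]

output = '''<tool_call>
{"name": "get_current_weather","arguments": {"location": "San Francisco, CA", "unit": "celsius"}}
</tool_call>'''
calls = parse_tool_calls(output)
```

Each element of `calls` is then a plain dict with `name` and `arguments` keys, ready to dispatch.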
1. **Download the model file** from HuggingFace (choose appropriate quantization)
2. **Install LlamaEdge runtime** (v0.12.5 or later recommended)
3. **Define your tool schemas** in JSON format with clear descriptions
4. **Format the system prompt** with tools wrapped in `<tools>` XML tags
5. **Initialize the model** with `groq-llama3-tool` prompt template
6. **Send user queries** to the model with your tool definitions
7. **Parse the XML-wrapped JSON output** to extract function name and arguments
8. **Execute the requested function** with the provided arguments
9. **Return results** to the user in natural language
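Steps 7–8 above can be sketched as a small registry mapping tool names to Python callables (the `get_current_weather` stub is illustrative; a real implementation would call a weather API):

```python
import json
import re

# Illustrative stub -- a real implementation would query a weather API.
def get_current_weather(location, unit="celsius"):
    return {"location": location, "temperature": 18, "unit": unit}

# Registry of callables the model is allowed to invoke.
TOOLS = {"get_current_weather": get_current_weather}

def dispatch(model_output):
    """Parse the XML-wrapped JSON and execute each requested function."""
    results = []
    for raw in re.findall(r"<tool_call>\s*(\{.*?\})\s*</tool_call>",
                          model_output, re.DOTALL):
        call = json.loads(raw)
        fn = TOOLS[call["name"]]          # look up the registered callable
        results.append(fn(**call["arguments"]))
    return results

results = dispatch('<tool_call>\n{"name": "get_current_weather","arguments": '
                   '{"location": "San Francisco, CA", "unit": "celsius"}}\n</tool_call>')
```

The returned results would then be fed back to the model (or summarized directly) to produce the natural-language answer in step 9.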