A fine-tuned Turkish Llama 8B model specialized in function calling using JSON schema definitions. The model understands Turkish instructions and generates appropriate function calls from user queries.
This skill enables AI agents to turn Turkish natural-language queries into structured, schema-conformant function calls.
Load the Turkish function calling model using the Unsloth library for optimized inference:
```python
import json

from unsloth import FastLanguageModel

# Load the fine-tuned checkpoint in 4-bit to reduce GPU memory usage
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="atasoglu/Turkish-Llama-3-8B-function-calling",
    load_in_4bit=True,
)
FastLanguageModel.for_inference(model)  # switch to Unsloth's fast inference mode
```
Configure the prompt templates that instruct the model on function calling behavior:
````python
# System prompt (Turkish). Translation: "You are a helpful, intelligent
# assistant capable of making function calls. I want you to answer the
# user's question appropriately using the functions given in the JSON
# fragment below. Instructions you must follow when making function calls:
# ... Answer the questions by following these instructions."
system_prompt = """Sen yardımsever, akıllı ve fonksiyon çağrısı yapabilen bir asistansın.
Aşağıda JSON parçası içinde verilen fonksiyonları kullanarak kullanıcının sorusunu uygun şekilde cevaplamanı istiyorum.
Fonksiyon çağrısı yaparken uyman gereken talimatlar:
Bu talimatlara uyarak soruları cevaplandır."""

# User prompt template: a "### Functions" header, the JSON tool list, then the query
user_prompt = """### Fonksiyonlar
```json
{tools}
```
{query}"""
````
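To see exactly what the model receives, the template can be rendered with a toy tool list (the minimal tool below is a stand-in for illustration, not part of the original example):

````python
import json

user_prompt = """### Fonksiyonlar
```json
{tools}
```
{query}"""

# Stand-in tool list, just to visualize the rendered prompt
toy_tools = [{"type": "function", "function": {"name": "get_weather"}}]

rendered = user_prompt.format(
    tools=json.dumps(toy_tools, ensure_ascii=False),
    query="Paris'te hava şu anda nasıl?",
)
print(rendered)
````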
Specify the functions available to the model and the user's query:
```python
# Tool definitions in OpenAI-style JSON schema format
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get current temperature for a given location.",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "City and country e.g. Bogotá, Colombia",
                    }
                },
                "required": ["location"],
                "additionalProperties": False,
            },
            "strict": True,
        },
    }
]

# User query: "What's the weather like in Paris right now?"
query = "Paris'te hava şu anda nasıl?"

messages = [
    {"role": "system", "content": system_prompt},
    {
        "role": "user",
        "content": user_prompt.format(
            tools=json.dumps(tools, ensure_ascii=False),  # keep Turkish characters unescaped
            query=query,
        ),
    },
]
```
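The same JSON schema can later be used to sanity-check arguments the model produces. A minimal stdlib sketch (a full validator such as the third-party `jsonschema` package would additionally check value types):

```python
def check_args(schema, args):
    """Minimal check of a parsed arguments dict against an OpenAI-style
    'parameters' schema: required keys present, no unknown keys."""
    props = schema.get("properties", {})
    missing = [k for k in schema.get("required", []) if k not in args]
    unknown = []
    if not schema.get("additionalProperties", True):
        unknown = [k for k in args if k not in props]
    return not missing and not unknown

# The 'parameters' schema from the get_weather tool above
schema = {
    "type": "object",
    "properties": {"location": {"type": "string"}},
    "required": ["location"],
    "additionalProperties": False,
}

print(check_args(schema, {"location": "Paris, France"}))  # True
print(check_args(schema, {"city": "Paris"}))              # False
```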
Tokenize inputs and generate the model's response:
```python
# Apply the chat template and move tensors to the GPU
inputs = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    return_dict=True,
    return_tensors="pt",
).to("cuda")

generation_kwargs = dict(
    do_sample=True,
    use_cache=True,
    max_new_tokens=500,
    temperature=0.3,  # low temperature keeps the JSON output stable
    top_p=0.9,
    top_k=40,
)

outputs = model.generate(**inputs, **generation_kwargs)
# Keep only the newly generated tokens, dropping the echoed prompt
output_ids = outputs[:, inputs["input_ids"].shape[1]:]
generated_texts = tokenizer.batch_decode(output_ids, skip_special_tokens=True)
```
Extract and validate the function call from the model's response:
```python
import re

def eval_function_calling(text):
    """Return (True, parsed_calls) if the response contains a fenced JSON
    block, otherwise (False, raw_text)."""
    match_ = re.search(r"```json(.*)```", text, re.DOTALL)
    if match_ is None:
        return False, text
    return True, json.loads(match_.group(1).strip())

has_function_calling, results = eval_function_calling(generated_texts[0])
if has_function_calling:
    for result in results:
        fn = result["function"]
        name, args = fn["name"], fn["arguments"]
        print(f"Calling {name!r} function with these arguments: {args}")
else:
    print(f"No function call: {results!r}")
```
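A quick check of the extractor against synthetic responses (both strings below are made up for illustration; the function is repeated so the snippet is self-contained):

```python
import json
import re

def eval_function_calling(text):
    match_ = re.search(r"```json(.*)```", text, re.DOTALL)
    if match_ is None:
        return False, text
    return True, json.loads(match_.group(1).strip())

# Synthetic response containing a fenced JSON call
sample = '```json\n[{"function": {"name": "get_weather", "arguments": {"location": "Paris, France"}}}]\n```'
ok, parsed = eval_function_calling(sample)
print(ok, parsed[0]["function"]["name"])  # True get_weather

# Plain text with no fenced JSON is passed through unchanged
ok2, text = eval_function_calling("Üzgünüm, uygun bir fonksiyon bulamadım.")
print(ok2, text)  # False ...
```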
**Input Query:** "Paris'te hava şu anda nasıl?" (What's the weather in Paris right now?)
**Output:**
```
Calling 'get_weather' function with these arguments: {"location":"Paris, France"}
```
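Once a call has been extracted, it still has to be executed. A minimal dispatch sketch, assuming `arguments` parses to a dict (if the model emits it as a JSON string, run `json.loads` on it first); the `get_weather` stub below is hypothetical:

```python
def get_weather(location):
    # Hypothetical stub; a real implementation would call a weather API
    return f"(stubbed weather for {location})"

# Registry mapping tool names to Python callables
registry = {"get_weather": get_weather}

# A parsed model response in the list-of-calls shape shown above
results = [{"function": {"name": "get_weather",
                         "arguments": {"location": "Paris, France"}}}]

replies = []
for result in results:
    fn = result["function"]
    handler = registry.get(fn["name"])
    if handler is None:
        raise ValueError(f"Model requested unknown function {fn['name']!r}")
    replies.append(handler(**fn["arguments"]))

print(replies[0])  # (stubbed weather for Paris, France)
```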