AgentFlow Planner 7B is a specialized planning agent built on Qwen2.5-7B-Instruct, designed to break complex tasks into actionable steps and orchestrate multi-step agentic workflows.
The AgentFlow Planner 7B agent specializes in decomposing high-level goals into ordered, dependency-aware steps that downstream agents can execute.
This model is part of the AgentFlow framework for building sophisticated multi-agent systems.
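As a rough sketch, the planner can be prompted like any Qwen2.5 chat model. The model id `AgentFlow/agentflow-planner-7b` and the system prompt below are illustrative assumptions, not confirmed names from the AgentFlow release:

```python
def build_planning_messages(goal: str) -> list[dict]:
    """Build a chat-format request asking the planner for a structured plan."""
    system = (
        "You are a planning agent. Decompose the user's goal into numbered "
        "steps, each with Action, Input, Output, and Dependencies, and end "
        "with a recommended Execution Order."
    )
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": f"Goal: {goal}"},
    ]

# Hypothetical inference call (model id is an assumption):
# from transformers import pipeline
# planner = pipeline("text-generation", model="AgentFlow/agentflow-planner-7b")
# plan_text = planner(build_planning_messages("Build a web scraper"))[0]["generated_text"]
```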
When using the AgentFlow Planner 7B model, follow these guidelines:
1. Understand the goal. Start by clearly establishing the user's high-level objective before planning.
2. Generate the plan. Use the model to create a step-by-step execution plan with inputs, outputs, and dependencies for each step.
3. Validate before execution. Check that every step's dependencies are satisfied and its required inputs are available.
4. Present the plan in a clear, structured format:
```
Goal: [High-level objective]
Plan:
1. [Step 1 name]
- Action: [What to do]
- Input: [Required inputs]
- Output: [Expected output]
- Dependencies: [Prerequisites]
2. [Step 2 name]
- Action: [What to do]
- Input: [Required inputs]
- Output: [Expected output]
- Dependencies: [Step 1]
[Continue for all steps...]
Execution Order: [Recommended sequence]
```
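Because the template above is line-oriented, the model's output can be parsed back into structured data with a few string operations. The function below is a minimal sketch, assuming the planner follows the template exactly:

```python
import re

def parse_plan(text: str) -> dict:
    """Parse a plan in the template format into a dict of goal, steps, and order."""
    plan = {"goal": None, "steps": []}
    current = None
    for raw in text.strip().splitlines():
        line = raw.strip()
        if line.startswith("Goal:"):
            plan["goal"] = line[len("Goal:"):].strip()
        elif re.match(r"^\d+\.\s", line):
            # A numbered line opens a new step, e.g. "1. Environment Setup"
            num, name = line.split(".", 1)
            current = {"number": int(num), "name": name.strip()}
            plan["steps"].append(current)
        elif line.startswith("- ") and current is not None:
            # Attribute lines like "- Action: ..." attach to the current step
            key, _, value = line[2:].partition(":")
            current[key.strip().lower()] = value.strip()
        elif line.startswith("Execution Order:"):
            plan["execution_order"] = line.split(":", 1)[1].strip()
    return plan
```

A parser like this lets an orchestrator dispatch each step programmatically instead of re-reading free-form text.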
If execution reveals issues, feed the failing step and its error back to the model and request a revised plan before continuing.
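One way to sketch that replanning loop is below; `run_step` and `request_revised_plan` are hypothetical placeholders for your step executor and a fresh call to the planner, not AgentFlow APIs:

```python
def run_plan(steps, run_step, request_revised_plan, max_replans=3):
    """Execute steps in order; on failure, ask the planner for a revised plan."""
    for _ in range(max_replans + 1):
        try:
            for step in steps:
                run_step(step)
            return True  # all steps succeeded
        except Exception as exc:
            # Feed the error back to the planner and retry with the new plan
            steps = request_revised_plan(steps, error=str(exc))
    return False  # gave up after max_replans revisions
```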
**User Request:** "Build a web scraper that collects product data from multiple e-commerce sites and stores it in a database."
**AgentFlow Planner Response:**
```
Goal: Create a multi-site web scraper with database storage
Plan:
1. Environment Setup
- Action: Install required libraries (requests, BeautifulSoup, SQLAlchemy)
- Input: Requirements list
- Output: Configured development environment
- Dependencies: None
2. Database Schema Design
- Action: Define product table structure (name, price, URL, timestamp)
- Input: Data fields to capture
- Output: Database schema file
- Dependencies: Step 1
3. Scraper Module Development
- Action: Write scraping functions for each target site
- Input: Target URLs, selectors for data extraction
- Output: Reusable scraper functions
- Dependencies: Step 1
4. Database Connection Layer
- Action: Implement SQLAlchemy models and CRUD operations
- Input: Database schema from Step 2
- Output: Database interface module
- Dependencies: Steps 1, 2
5. Data Pipeline Integration
- Action: Connect scraper output to database storage
- Input: Scraper functions, database interface
- Output: End-to-end pipeline
- Dependencies: Steps 3, 4
6. Error Handling & Logging
- Action: Add try-except blocks, implement logging
- Input: Pipeline from Step 5
- Output: Production-ready scraper with error handling
- Dependencies: Step 5
Execution Order: Sequential (1→2→3→4→5→6)
```
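The "Execution Order" in the scraper plan above is sequential, but the dependencies actually allow some parallelism (Steps 2 and 3 both depend only on Step 1). A topological sort over the declared dependencies recovers a valid order mechanically; this is a generic sketch using Kahn's algorithm, not AgentFlow code:

```python
from collections import deque

def execution_order(deps: dict[int, list[int]]) -> list[int]:
    """Topologically sort steps by their dependencies (Kahn's algorithm)."""
    indegree = {step: len(d) for step, d in deps.items()}
    dependents = {step: [] for step in deps}
    for step, d in deps.items():
        for prereq in d:
            dependents[prereq].append(step)
    ready = deque(sorted(s for s, n in indegree.items() if n == 0))
    order = []
    while ready:
        step = ready.popleft()
        order.append(step)
        for nxt in dependents[step]:
            indegree[nxt] -= 1
            if indegree[nxt] == 0:
                ready.append(nxt)
    if len(order) != len(deps):
        raise ValueError("cycle detected in step dependencies")
    return order

# Dependencies as declared in the scraper plan above
scraper_deps = {1: [], 2: [1], 3: [1], 4: [1, 2], 5: [3, 4], 6: [5]}
```

Steps that become ready at the same time (here 2 and 3) could also be dispatched concurrently by an orchestrator.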
- Research Paper: https://huggingface.co/papers/date/2025-10-08
- GitHub: https://github.com/lupantech/AgentFlow
- Demo: https://huggingface.co/spaces/AgentFlow/agentflow
- Website: https://agentflow.stanford.edu/
Use this agent when you need intelligent task decomposition before executing complex multi-step operations.