An AI coding assistant specialized in developing and maintaining the BigQuery Cost Intelligence Engine (BCIE), a serverless GCP application that analyzes large BigQuery datasets and provides actionable cost-saving recommendations.
BCIE is designed to integrate with Retool dashboards, store recommendations in BigQuery, and leverage machine learning for enhanced insights. It uses serverless GCP components including Cloud Run, Cloud Functions, and Pub/Sub for asynchronous processing.
When working on this project, use these commands:
- Set up the environment: `./setup_dev.sh`
- Run tests: `./run_tests.sh` or `pytest tests/`
- Format code: Black (88-character line length) and isort
Follow these strict code style guidelines:
1. **Python Version**: Python 3.9+ required
2. **PEP 8 Compliance**: Follow PEP 8 formatting standards
3. **Formatting**: Use Black formatter with 88 character line length
4. **Type Hints**: Use type hints for all function parameters and return values
5. **Import Order**: Standard library, third-party packages, local packages (enforced by isort)
6. **Naming Conventions**:
- Classes: PascalCase (CapWords, per PEP 8)
- Functions and variables: snake_case
- Use descriptive names, avoid abbreviations
7. **Error Handling**: Handle errors explicitly with try/except, log all exceptions
8. **String Formatting**: Use f-strings exclusively
9. **Documentation**: Document all public functions and classes with docstrings
10. **Data Structures**: Use dataclasses for data containers
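The style rules above can be illustrated in one short sketch. The module, class, and field names here are hypothetical examples, not actual BCIE code:

```python
import logging
from dataclasses import dataclass

logger = logging.getLogger(__name__)


@dataclass
class TableStats:
    """Basic size statistics for a BigQuery table (hypothetical container)."""

    table_id: str
    size_bytes: int


def format_size_report(stats: TableStats) -> str:
    """Return a human-readable size report for a table.

    Demonstrates type hints, f-strings, docstrings, and explicit error
    handling with logged exceptions.
    """
    try:
        size_gb = stats.size_bytes / 1024**3
    except TypeError:
        logger.exception(f"Invalid size_bytes for table {stats.table_id}")
        raise
    return f"{stats.table_id}: {size_gb:.2f} GiB"
```

For example, `format_size_report(TableStats("ds.events", 2 * 1024**3))` returns `"ds.events: 2.00 GiB"`.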
When designing or modifying system components:
1. **Serverless Focus**: Utilize serverless GCP components (Cloud Run, Cloud Functions, Pub/Sub)
2. **Asynchronous Processing**: Employ Pub/Sub for asynchronous messaging and processing of large datasets
3. **BigQuery Storage**: Store all recommendations and status information in BigQuery tables
4. **API Design**: Design RESTful APIs for Retool integration with secure authentication
5. **Modular Design**: Maintain clear separation of concerns across modules
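A minimal sketch of principles 2 and 3: serializing an analysis job as a Pub/Sub message payload. The message schema and topic name are assumptions for illustration; the actual publish call (which requires `google-cloud-pubsub` and GCP credentials) is shown in a comment so the sketch stays runnable standalone:

```python
import json
import uuid
from dataclasses import asdict, dataclass


@dataclass
class AnalysisRequest:
    """Payload for an asynchronous dataset-analysis job (hypothetical schema)."""

    job_id: str
    project_id: str
    dataset_id: str


def build_message(project_id: str, dataset_id: str) -> bytes:
    """Serialize an analysis request to UTF-8 JSON bytes for Pub/Sub."""
    request = AnalysisRequest(
        job_id=str(uuid.uuid4()),
        project_id=project_id,
        dataset_id=dataset_id,
    )
    return json.dumps(asdict(request)).encode("utf-8")


# Publishing with the real client would look roughly like this:
#
#   from google.cloud import pubsub_v1
#   publisher = pubsub_v1.PublisherClient()
#   topic_path = publisher.topic_path("my-project", "bcie-analysis-requests")
#   publisher.publish(topic_path, build_message("my-project", "my_dataset"))
```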
Implement comprehensive testing at multiple levels:
1. **Unit Tests**: Write tests for all modules with high coverage
2. **Integration Tests**: Verify interactions between modules
3. **End-to-End Tests**: Validate complete workflows
4. **Performance Tests**: Test with varying dataset sizes to ensure scalability
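A unit test at the first level might look like this. The optimizer function under test is a hypothetical stand-in, not actual BCIE code; the test uses the plain-assert style that pytest collects:

```python
def estimate_partition_savings(
    monthly_scan_bytes: int, partition_prune_ratio: float
) -> int:
    """Estimate bytes saved per month by partitioning (hypothetical logic).

    `partition_prune_ratio` is the fraction of scanned bytes that partition
    pruning would eliminate, between 0.0 and 1.0.
    """
    if not 0.0 <= partition_prune_ratio <= 1.0:
        raise ValueError(f"Invalid prune ratio: {partition_prune_ratio}")
    return int(monthly_scan_bytes * partition_prune_ratio)


def test_estimate_partition_savings() -> None:
    """Unit test covering the happy path and a boundary value."""
    assert estimate_partition_savings(1_000_000, 0.25) == 250_000
    assert estimate_partition_savings(1_000_000, 0.0) == 0
```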
When setting up a new development environment:
1. Verify Python 3.9+ is installed
2. Run setup script: `./setup_dev.sh`
3. Activate virtual environment: `source venv/bin/activate`
4. Run tests to verify setup: `./run_tests.sh` or `pytest tests/`
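Step 1's version check can be scripted; a minimal sketch (the function name is a hypothetical helper, not part of the setup script):

```python
import sys


def meets_minimum(version: tuple, minimum: tuple = (3, 9)) -> bool:
    """Return True if `version` satisfies the project's minimum Python version."""
    return version >= minimum


if __name__ == "__main__":
    if not meets_minimum(sys.version_info[:2]):
        sys.exit(f"Python 3.9+ required, found {sys.version.split()[0]}")
    print("Python version OK")
```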
When asked to implement features or fix issues:
1. **Understand Requirements**: Clarify the specific optimization area or feature being developed
2. **Check Existing Code**: Read relevant modules before making changes
3. **Follow Style Guide**: Ensure all code adheres to the formatting and style requirements
4. **Implement Tests**: Write or update tests alongside code changes
5. **Run Quality Checks**: Execute linting, formatting, type checking, and tests
6. **Document Changes**: Update docstrings and comments as needed
7. **Verify Integration**: Ensure changes work within the broader system architecture
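Step 5's quality checks can be gathered into one helper. Black, isort, and pytest are named elsewhere in this document; flake8 and mypy are assumptions about the project's linting and type-checking tools:

```python
import subprocess


def quality_check_commands() -> list[list[str]]:
    """Commands for the quality-check step (flake8 and mypy are assumed tools)."""
    return [
        ["black", "--check", "."],
        ["isort", "--check-only", "."],
        ["flake8", "."],  # linting (assumed tool)
        ["mypy", "."],  # type checking (assumed tool)
        ["pytest", "tests/"],
    ]


def run_quality_checks() -> bool:
    """Run each check in sequence; return True only if all pass."""
    return all(
        subprocess.run(cmd, check=False).returncode == 0
        for cmd in quality_check_commands()
    )
```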
"Implement a new optimizer module for analyzing table partitioning efficiency that calculates potential cost savings and generates recommendations following the project's architecture patterns."
"Add integration tests for the Retool webhook handler that verify asynchronous processing and status updates."
"Refactor the recommendation engine to improve prioritization logic while maintaining backward compatibility with existing BigQuery storage schemas."
# Download SKILL.md from killerskills.ai/api/skills/bigquery-cost-optimizer-assistant/raw