Instructions for building a modular Python accessibility and speech application with Kivy UI, text-to-speech, speech recognition, and deployment support for Windows, Android, and iOS.
The project is organized as a modular Python application focused on accessibility features. Its key libraries cover the Kivy UI, text-to-speech, and speech recognition.
When working on this project, follow these principles:
1. **Modular Architecture**: Keep all main functionality in separate modules under `src/`
2. **Accessibility First**: Prioritize screen reader compatibility and speech features
3. **Cross-Platform Support**: Ensure code works on Windows, Android, and iOS where applicable
4. **Dependency Management**: Add new dependencies to `requirements.txt` immediately
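The "modular" and "cross-platform" principles can be combined by hiding platform-specific speech engines behind one small interface. A minimal sketch, assuming nothing about the real project beyond its use of TTS (module, class, and function names here are illustrative, not taken from the codebase):

```python
# src/speech.py (illustrative): platform-specific TTS engines behind one interface.
from abc import ABC, abstractmethod

class SpeechBackend(ABC):
    """Every platform-specific engine implements this one method."""
    @abstractmethod
    def speak(self, text: str) -> None: ...

class LogBackend(SpeechBackend):
    """Fallback backend: records text instead of speaking (useful in tests)."""
    def __init__(self):
        self.spoken = []
    def speak(self, text: str) -> None:
        self.spoken.append(text)

def get_backend() -> SpeechBackend:
    # A real implementation would branch on platform.system() and return a
    # pyttsx3- or plyer-based backend; this sketch always returns the stub.
    return LogBackend()

backend = get_backend()
backend.speak("Hello")
```

Callers depend only on `SpeechBackend`, so swapping in a Windows, Android, or iOS engine later does not touch feature code.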
Use PyInstaller to create standalone Windows executables:
```bash
pyinstaller --onefile --windowed src/main.py
```
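On Windows, Kivy apps typically also need their SDL2/GLEW binaries bundled, which the one-line command above does not cover. A hypothetical spec-file sketch, following Kivy's documented PyInstaller recipe (the `talkback` name is a placeholder):

```python
# talkback.spec (illustrative) -- run with: pyinstaller talkback.spec
from kivy_deps import sdl2, glew  # Windows-only helper packages from Kivy

a = Analysis(['src/main.py'])
pyz = PYZ(a.pure)
exe = EXE(pyz, a.scripts, a.binaries, a.datas,
          *[Tree(p) for p in (sdl2.dep_bins + glew.dep_bins)],
          name='talkback', console=False)
```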
Detailed instructions are in `README.md`.
Use Buildozer for Android deployment:
```bash
buildozer android debug
```
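Buildozer reads its settings from a `buildozer.spec` file generated by `buildozer init`. A minimal sketch for this kind of project (title, package name, and domain are placeholders):

```ini
[app]
title = Talkback Assistant
package.name = talkbackassistant
package.domain = org.example
source.dir = src
requirements = python3,kivy
```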
Use the kivy-ios toolchain for iOS builds (requires macOS with Xcode).
When adding new features:
1. Create a new module in `src/` for significant functionality
2. Update `requirements.txt` if new dependencies are needed
3. Document the feature in this file
4. Ensure accessibility compliance (keyboard navigation, screen reader support)
5. Test speech output and recognition if applicable
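The checklist above can be sketched as a feature-module template. Everything here is illustrative (the feature, its module path, and the `speak()` convention are assumptions, not project API):

```python
# src/volume_feature.py (illustrative): a self-contained feature module (step 1).
class VolumeFeature:
    name = "volume"

    def __init__(self, speech=None):
        # Steps 4-5: announce state changes so speech/screen-reader users get
        # feedback; `speech` is any object exposing a .speak(text) method.
        self.speech = speech
        self.level = 50

    def increase(self, step: int = 10) -> int:
        self.level = min(100, self.level + step)
        self._announce()
        return self.level

    def _announce(self) -> None:
        if self.speech is not None:
            self.speech.speak(f"Volume {self.level} percent")

class _EchoSpeech:
    """Test double standing in for a real TTS backend (step 5)."""
    def __init__(self):
        self.messages = []
    def speak(self, text):
        self.messages.append(text)

speech = _EchoSpeech()
feature = VolumeFeature(speech)
feature.increase()  # raises level from 50 to 60 and announces it
```

Injecting the speech object keeps the module testable without audio hardware, which also satisfies step 5 on CI machines.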
The application includes a Kivy UI, text-to-speech output, and speech recognition; the platform supports Windows, Android, and iOS.