Expert guidance for working with the YSBA (York Simcoe Baseball Association) Live Standings and Schedules application: development, debugging, API endpoints, scraping logic, and the multi-division architecture.
This skill provides comprehensive guidance for developing and debugging the YSBA Live application: a Node.js web application that displays real-time baseball standings using Express, vanilla JavaScript, and JSON files pre-generated by GitHub Actions.
**Server**: Express application (`server-optimized.js`) serving cached JSON files only - no in-app scraping
**Background Worker**: GitHub Actions-based scraper (`scripts/github-action-scraper.js`) running every 30 minutes
**Scraping Engine**: Modular Puppeteer-based system in `src/scraper/` (scraper.js, formatter.js, writer.js)
**Frontend**: Vanilla JavaScript (`public/js/app.js`) with standings display and team schedules
**Configuration**: Multi-division system centralized in `config.js` (Rep divisions: 8U-22U+Senior with A/AA/AAA tiers; Select divisions: 9U/11U/13U/15U)
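The division/tier layout described above can be sketched as a plain object. This is a hypothetical reconstruction of the `config.js` shape: the `repDivisions` field and the `{ division, tiers }` entry format follow the example used later in this document, but `selectDivisions` and the flattening helper are assumptions — check the real file.

```javascript
// Hypothetical sketch of config.js — field names partly assumed.
const config = {
  repDivisions: [
    { division: '8U-rep', tiers: ['A', 'AA', 'AAA'] },
    { division: '13U-rep', tiers: ['A', 'AA', 'AAA'] },
    { division: '22U-Senior-rep', tiers: ['A', 'AA', 'AAA'] },
  ],
  selectDivisions: [
    { division: '9U-select' },
    { division: '11U-select' },
    { division: '13U-select' },
    { division: '15U-select' },
  ],
};

// Flatten into the division-tier keys a scraper loop would iterate over.
// Select divisions have no tiers, so they use the 'all-tiers' key seen
// in the API examples below.
function allDivisionTiers(cfg) {
  const rep = cfg.repDivisions.flatMap(d =>
    d.tiers.map(t => `${d.division}-${t}`));
  const select = cfg.selectDivisions.map(d => `${d.division}-all-tiers`);
  return [...rep, ...select];
}

console.log(allDivisionTiers(config));
```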
Run the appropriate development command:
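The exact development commands live in `package.json`; the list below is an assumed starting point built from the scripts named elsewhere in this document (`test-worker`, `test-scraper`, `build`) plus a direct server start — verify the script names before relying on them.

```bash
npm install                # install dependencies
node server-optimized.js   # start the Express server (serves cached JSON on :3000)
npm run test-worker        # exercise the background worker locally
npm run build              # bump the frontend cache version
```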
Use curl or browser to verify endpoints:
```bash
curl "http://localhost:3000/api/status"
curl "http://localhost:3000/api/divisions?filterEmpty=true"
curl "http://localhost:3000/api/standings?division=9U-select&tier=all-tiers"
curl "http://localhost:3000/api/team/TEAMCODE/schedule?division=13U-rep&tier=A"
curl "http://localhost:3000/api/stories"
```
Enable debug mode in browser console:
```javascript
localStorage.setItem('debugMode', 'true')
```
**a)** Open `config.js` and add division to appropriate section (rep or select)
**b)** Test the new division:
**c)** Verify scraping works via background worker test
**a)** Update `src/scraper/scraper.js` with new Puppeteer selectors or logic
**b)** Test locally:
```bash
npm run test-worker
npm run test-scraper
```
**c)** Verify generated JSON files in `data/` and `public/` directories
**d)** Deploy changes to trigger GitHub Actions scraping
**a)** Main logic: Edit `public/js/app.js`
**b)** Styles: Edit `public/css/styles.css`
**c)** Update cache version:
```bash
npm run build
```
**d)** Test changes locally before deploying
**a)** Modify `email-service.js` for email notification logic
**b)** Test subscriber data:
**c)** Note: the GitHub Gist backup runs automatically on deployment (the Gist is primary storage, with a 4KB environment-variable fallback)
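The primary/fallback storage described in step c) can be sketched as follows. This is a minimal illustration, not the real `email-service.js` API — the function names, the entry points, and the idea of passing raw JSON strings in are all assumptions; only the Gist-primary/4KB-env-fallback arrangement comes from this document.

```javascript
// Hypothetical sketch: Gist is primary, a 4KB env var is the fallback.
const MAX_ENV_BYTES = 4096;

function loadSubscribers({ gistJson, envJson }) {
  // Prefer the Gist copy; fall back to the env var if the Gist is unavailable.
  const raw = gistJson ?? envJson;
  return raw ? JSON.parse(raw) : [];
}

function fitsInEnvFallback(subscribers) {
  // The env-var fallback only works while the serialized list stays under 4KB.
  return Buffer.byteLength(JSON.stringify(subscribers), 'utf8') <= MAX_ENV_BYTES;
}

const subs = loadSubscribers({ gistJson: null, envJson: '["a@example.com"]' });
console.log(subs, fitsInEnvFallback(subs));
```

The 4KB ceiling is why the Gist is primary: subscriber lists outgrow an environment variable quickly.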
**a)** Modify `ai-story-service.js` for OpenAI-powered story generation logic
**b)** Test manually:
```bash
curl -X POST "http://localhost:3000/api/stories/generate"
```
**c)** Stories auto-generate when background worker detects story-worthy events (first wins, hot streaks, undefeated runs, etc.)
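The event detection in step c) might look like the sketch below. The event types (first wins, hot streaks, undefeated runs) come from this document; the function name, record shape, and thresholds are assumptions — the real logic lives in the background worker.

```javascript
// Hypothetical story-event detector comparing a team's previous and
// current records. Thresholds (5 wins, +3 streak) are illustrative only.
function detectStoryEvents(prev, curr) {
  const events = [];
  if (prev.wins === 0 && curr.wins > 0) {
    events.push({ type: 'first-win', team: curr.team });
  }
  if (curr.losses === 0 && curr.wins >= 5) {
    events.push({ type: 'undefeated-run', team: curr.team });
  }
  if (curr.wins - prev.wins >= 3) {
    events.push({ type: 'hot-streak', team: curr.team });
  }
  return events;
}

// Example: a team picks up its first three wins in one scrape cycle.
const prev = { team: 'Aurora Jays', wins: 0, losses: 2 };
const curr = { team: 'Aurora Jays', wins: 3, losses: 2 };
console.log(detectStoryEvents(prev, curr)); // first-win and hot-streak both fire
```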
| File | Purpose |
|------|---------|
| `server-optimized.js` | Main Express server, API endpoints, serves cached JSON only |
| `scripts/github-action-scraper.js` | GitHub Actions worker orchestrating scraping, emails, stories |
| `src/scraper/scraper.js` | Modular Puppeteer scraping engine |
| `src/scraper/formatter.js` | Data formatting and structuring for JSON output |
| `src/scraper/writer.js` | File writing operations for data/ and public/ directories |
| `config.js` | Multi-division configuration (all divisions, tiers, settings) |
| `public/js/app.js` | Frontend application logic (standings, schedules, interactions) |
| `email-service.js` | SendGrid email notifications with GitHub Gist backup |
| `ai-story-service.js` | OpenAI-powered homepage story generation |
Three-level cache system (30-minute duration):
1. **Division cache**: `cachedDataByDivision[division-tier]` - Main standings
2. **Team schedule cache**: `teamScheduleCache[teamCode-division-tier]` - Individual teams
3. **Comprehensive schedule cache**: `allGamesCache[schedule-division-tier]` - All games for background loading
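The three cache objects and the 30-minute expiry can be sketched as below. The cache names and key patterns follow this document; the per-entry shape (`{ data, fetchedAt }`) and the lookup helper are assumptions about the server implementation.

```javascript
// Sketch of the three-level cache with a shared 30-minute TTL.
const CACHE_DURATION_MS = 30 * 60 * 1000;

const cachedDataByDivision = {}; // key: `${division}-${tier}`
const teamScheduleCache = {};    // key: `${teamCode}-${division}-${tier}`
const allGamesCache = {};        // key: `schedule-${division}-${tier}`

// Return cached data if present and fresh; null forces a reload.
function getCached(cache, key, now = Date.now()) {
  const entry = cache[key];
  if (!entry || now - entry.fetchedAt > CACHE_DURATION_MS) return null;
  return entry.data;
}

cachedDataByDivision['13U-rep-A'] = { data: { standings: [] }, fetchedAt: Date.now() };
console.log(getCached(cachedDataByDivision, '13U-rep-A') !== null); // fresh entry
```

Keeping key construction in one place avoids the mismatched-key bugs flagged under cache troubleshooting below.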
**Standings not updating**: Check GitHub Actions logs, verify worker completed successfully, inspect generated JSON files in `data/` and `public/`
**Scraping errors**: Test locally with `npm run test-worker`, check Puppeteer selectors in `src/scraper/scraper.js`, verify YSBA website HTML structure hasn't changed
**Email notifications failing**: Verify SendGrid API key, check GitHub Gist accessibility, inspect subscriber storage mechanism
**Stories not generating**: Check OpenAI API key, verify story-worthy events occurred, test manual generation endpoint
**Cache issues**: Clear cache by restarting server, verify 30-minute cache duration is appropriate, check cache key construction
1. Open `config.js`
2. Add to repDivisions array: `{ division: '18U-rep', tiers: ['A', 'AA', 'AAA'] }`
3. Test: `curl "http://localhost:3000/api/standings?division=18U-rep&tier=A"`
4. Deploy and verify GitHub Actions scrapes new division
1. Open `src/scraper/scraper.js`
2. Find win/loss column selector
3. Update selector: `const wlCell = row.querySelector('td:nth-child(4)')`
4. Test: `npm run test-worker`
5. Verify JSON output format
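For step 5, a small shape check can catch a silently broken selector before deployment. The field names below (`teams`, `wins`, `losses`) are assumptions about the generated JSON format — adjust them to the real schema in `data/`.

```javascript
// Hypothetical sanity check for a generated standings file.
// Field names (teams, wins, losses) are assumed, not the real schema.
function validateStandings(json) {
  if (!json || !Array.isArray(json.teams)) return false;
  return json.teams.every(t =>
    typeof t.wins === 'number' && typeof t.losses === 'number');
}

// Example: run against a parsed file from data/ after `npm run test-worker`.
const sample = { teams: [{ team: 'Newmarket Hawks', wins: 3, losses: 1 }] };
console.log(validateStandings(sample)); // true
```

A selector pointing at the wrong column typically yields strings or `NaN` here, which this check rejects.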
1. Open `email-service.js`
2. Find change detection logic
3. Modify threshold criteria (position changes, win totals, etc.)
4. Deploy and monitor next GitHub Actions run
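The threshold criteria in step 3 might be structured like this sketch. The criteria names (position changes, win totals) come from this document; the function, record shape, and default thresholds are assumptions about `email-service.js`.

```javascript
// Hypothetical change-detection threshold for email notifications.
// Defaults are illustrative: notify on any position move or any new win.
function hasNotableChange(prev, curr, { positionDelta = 1, winDelta = 1 } = {}) {
  return Math.abs(prev.position - curr.position) >= positionDelta ||
         curr.wins - prev.wins >= winDelta;
}

console.log(hasNotableChange(
  { position: 3, wins: 4 },
  { position: 2, wins: 4 },
)); // true: moved up one position
```

Raising `positionDelta` or `winDelta` is the kind of tuning step 3 describes.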