Development Velocity
Items completed per session, derived from git history and PR merge data.
Product Metrics
What we track and what we plan to track. 2 active, 4 planned.
Page Views
Tracking: Track which pages judges, captains, and organizers visit most frequently.
Google Analytics 4 wired via NEXT_PUBLIC_GA_MEASUREMENT_ID. SPA-aware pageview tracker fires config events on route change. Priority pages: /judge, /captain, /organizer/competition.
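As a rough sketch of what the SPA-aware tracker does: on each client-side route change it sends a GA4 config event with the new path. The `trackPageview` helper, the injected `gtag` signature, and the `G-XXXXXXX` placeholder ID are illustrative, not the app's actual code.

```typescript
// Illustrative SPA pageview tracking: one GA4 "config" call per route change.
type Gtag = (
  command: "config",
  measurementId: string,
  params: { page_path: string }
) => void;

// Hypothetical helper: fire a config event so GA4 records an SPA navigation.
function trackPageview(gtag: Gtag, measurementId: string, path: string): void {
  gtag("config", measurementId, { page_path: path });
}

// Simulate route changes, recording what would be sent to GA4.
const sent: string[] = [];
const fakeGtag: Gtag = (_cmd, _id, params) => {
  sent.push(params.page_path);
};
for (const path of ["/judge", "/captain", "/organizer/competition"]) {
  trackPageview(fakeGtag, "G-XXXXXXX", path); // placeholder measurement ID
}
```

In the real app the measurement ID would come from NEXT_PUBLIC_GA_MEASUREMENT_ID and the route-change callback from the router, but the shape of the call is the same.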
User Identity
Planned: Associate analytics events with roles (judge, captain, organizer) without PII.
Use role + anonymized session ID. Never log CBJ numbers or names to analytics.
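One way the role + anonymized session ID pairing could look, assuming a server-issued random session token: hash the token one-way so the raw value never reaches the analytics vendor, and keep CBJ numbers and names out of the payload entirely. Function and field names here are hypothetical.

```typescript
import { createHash } from "node:crypto";

type Role = "judge" | "captain" | "organizer";

// Hypothetical identity builder: role plus a one-way-hashed session token.
// CBJ numbers and names are deliberately never part of this payload.
function analyticsIdentity(
  role: Role,
  sessionId: string
): { role: Role; anonId: string } {
  const anonId = createHash("sha256")
    .update(sessionId)
    .digest("hex")
    .slice(0, 16); // truncated hash is enough to correlate a session
  return { role, anonId };
}
```

The hash is stable within a session (events correlate) but not reversible to the token, and rotating the session token rotates the analytics identity.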
Feature Adoption
Planned: Measure which features are actually used during live competitions.
Key features: comment cards (toggle rate), correction requests (frequency), score audit (organizer usage).
Engagement Depth
Planned: Track session duration and actions per session by role.
Judges should have short, focused sessions. Organizers need sustained attention. Captain sessions are event-driven.
Performance Metrics
Tracking: Core Web Vitals (LCP, FID, CLS) and API response times.
Health endpoint measures DB latency (target <500ms). Vercel Analytics provides CWV. Server actions need timing instrumentation.
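The timing instrumentation server actions still need could start as small as a wrapper like this (names are hypothetical; an async variant would await the wrapped promise the same way before reporting):

```typescript
// Hypothetical timing wrapper: run a handler, report its label and duration.
function timed<T>(
  label: string,
  fn: () => T,
  report: (label: string, ms: number) => void
): T {
  const start = Date.now();
  try {
    return fn();
  } finally {
    // finally ensures the duration is reported even if fn throws
    report(label, Date.now() - start);
  }
}

// Example: instrument a trivial handler and collect the timing.
const timings: Array<[string, number]> = [];
const result = timed("healthcheck", () => 2 + 2, (l, ms) => timings.push([l, ms]));
```

Reported durations could feed the same dashboard as the health endpoint's <500ms DB latency target.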
Click Tracking
Planned: Track key interaction patterns: score picker usage, navigation flows, error states.
Priority: score picker (which scores are most common), correction request flow (abandon rate), category advancement (time between advancements).
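Answering "which scores are most common" is a simple tally over score-picker events. A minimal sketch, assuming picked scores arrive as a flat list of numbers:

```typescript
// Tally how often each score value is picked.
function scoreHistogram(scores: number[]): Map<number, number> {
  const hist = new Map<number, number>();
  for (const s of scores) {
    hist.set(s, (hist.get(s) ?? 0) + 1);
  }
  return hist;
}

// Example: four picks, with 9 chosen twice.
const hist = scoreHistogram([9, 8, 9, 7]);
```

Sorting the map entries by count descending gives the "most common scores" view directly.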
Key Questions
Product questions we need to answer, how to measure them, and where the data lives.
Product Questions
How long does it take a judge to complete one box?
High: Measure time from ScoreCard creation (appearance phase) to final submission. Target: 2-4 minutes per box. Break down by appearance-only phase vs taste/texture phase.
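The measurement itself is a duration between two timestamps; reporting the median keeps one stuck box from skewing the number. A sketch, assuming ScoreCard exposes creation and submission times (field names hypothetical):

```typescript
// Assumed shape: epoch-millisecond timestamps on each ScoreCard.
interface ScoreCardTimes {
  createdAt: number;
  submittedAt: number;
}

// Median minutes from ScoreCard creation to final submission.
function medianBoxMinutes(cards: ScoreCardTimes[]): number {
  const mins = cards
    .map((c) => (c.submittedAt - c.createdAt) / 60_000)
    .sort((a, b) => a - b);
  const mid = Math.floor(mins.length / 2);
  return mins.length % 2 ? mins[mid] : (mins[mid - 1] + mins[mid]) / 2;
}
```

Running the same function over appearance-only vs taste/texture subsets gives the per-phase breakdown the question asks for.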
What percentage of scores require correction?
High: Count CorrectionRequests / total ScoreCards per competition. A healthy rate is <5%; higher suggests UI confusion or unclear scoring criteria. Track by category (Brisket is historically hardest).
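The metric is a straight ratio checked against the 5% threshold. A minimal sketch (the example counts are made up):

```typescript
// Correction rate: CorrectionRequests divided by total ScoreCards.
function correctionRate(correctionRequests: number, scoreCards: number): number {
  if (scoreCards === 0) return 0; // no cards yet: nothing to correct
  return correctionRequests / scoreCards;
}

// Example: 6 corrections across 240 score cards.
const rate = correctionRate(6, 240);
const healthy = rate < 0.05; // the <5% threshold from above
```

Computing the same ratio per category surfaces outliers like Brisket without changing the formula.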
Are comment cards useful or friction?
Medium: Compare competition outcomes with commentCardsEnabled on vs off. Measure judge session duration, comment card completion rate, and organizer toggle frequency. If >30% of judges skip optional fields, simplify the form.
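The >30% trigger reduces to counting how many judges left optional fields incomplete. A sketch, assuming each submitted comment card records how many optional fields it had and how many were filled (field names hypothetical):

```typescript
// Assumed per-card data: optional field count and how many were filled.
interface CommentCard {
  optionalFields: number;
  filled: number;
}

// Fraction of cards where at least one optional field was skipped.
function optionalSkipRate(cards: CommentCard[]): number {
  if (cards.length === 0) return 0;
  const skipped = cards.filter((c) => c.filled < c.optionalFields).length;
  return skipped / cards.length;
}

// Example: 2 of 4 cards skipped something, so the 30% trigger would fire.
const skipRate = optionalSkipRate([
  { optionalFields: 2, filled: 2 },
  { optionalFields: 2, filled: 1 },
  { optionalFields: 2, filled: 0 },
  { optionalFields: 2, filled: 2 },
]);
const simplifyForm = skipRate > 0.3;
```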
How many judges struggle with the setup flow?
High: Track progression through the four setup phases (not-registered → awaiting-table → pick-seat → ready). Measure time in each phase, drop-off between phases, and support requests. Target: <3 minutes from login to ready.
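Drop-off between phases falls out of per-phase arrival counts. A sketch, assuming we can count how many judges reached each of the four phases (the counts below are made up):

```typescript
// The four setup phases, in order.
const phases = ["not-registered", "awaiting-table", "pick-seat", "ready"] as const;

// Fraction of users lost at each transition between consecutive phases.
function dropOff(counts: number[]): number[] {
  return counts.slice(1).map((n, i) => (counts[i] - n) / counts[i]);
}

// Example: 24 judges log in, 2 stall between awaiting-table and pick-seat.
const counts = [24, 24, 22, 22];
const losses = dropOff(counts);
const report = losses.map(
  (l, i) => `${phases[i]} -> ${phases[i + 1]}: ${(l * 100).toFixed(1)}% lost`
);
```

The transition with the biggest loss is where setup-flow fixes (or captain intervention) would pay off most.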
Is the organizer dashboard efficient?
Medium: Measure clicks-to-complete for key organizer tasks: advancing a category, reviewing box distribution, checking results. Compare time-on-page between the old single-page layout and the new tabbed structure.
What's the peak concurrent user count?
Low: During active judging, all 24 judges + 4 captains + 1 organizer are concurrent (29 users). Monitor API response times under load, DB connection pool usage, and any 500 errors during scoring rounds.
Analytics Platform
Google Analytics 4 — Active
Pageviews and sessions tracked on both the app (app.thejudgetool.com) and the marketing site (www.thejudgetool.com). App uses a Suspense-wrapped PageviewTracker so client-side navigations fire config events. No PII — CBJ numbers and names never leave the server.
Complementary Platforms (not yet added)
Event funnels, session replay, feature flags. Adds what GA4 doesn't cover.
Core Web Vitals — zero-config add-on if we want CWV tracking.
Alternative to PostHog — stronger funnel analysis, generous free tier.