Learning Objectives

By the end of this chapter, you will be able to:
- Design effective analytics team structures and define roles
- Hire and develop analytics talent in football organizations
- Prioritize projects and allocate resources strategically
- Implement agile methodologies for analytics workflows
- Establish code quality standards and documentation practices
- Build collaborative relationships with football operations and coaching
- Manage analytics tools and infrastructure
- Develop a culture of continuous improvement and learning
- Create career development paths for analytics professionals
- Navigate the unique challenges of sports analytics leadership
Introduction
Building and managing an analytics department in football is fundamentally different from leading analytics teams in other industries. The high-stakes, time-sensitive nature of football—where decisions must be made quickly, seasons are compressed, and organizational culture is deeply rooted in tradition—creates unique challenges for analytics leaders.
Success requires more than technical expertise. Analytics directors must balance rigorous methodology with practical application, build trust with skeptical stakeholders, deliver insights under intense time pressure, and demonstrate value in ways that resonate with both front office executives and coaching staff. They must navigate a complex organizational landscape where analytics is often viewed as disruptive to established practices.
Understanding the organizational structure of an analytics department is crucial for both aspiring analytics professionals and decision-makers considering building or expanding their analytics capabilities. Unlike traditional corporate analytics teams that might focus on marketing optimization or financial forecasting with relatively stable timelines, football analytics teams operate in an environment where the entire competitive landscape resets annually through the draft, urgency peaks during the season when games occur weekly, and the margin between success and failure is measured in fractions of expected points.
This chapter provides a comprehensive guide to running a football analytics department. We'll explore team structures appropriate for different organizational sizes and budgets, hiring strategies that balance technical skills with football acumen and communication abilities, project management frameworks adapted to football's unique rhythms, best practices for code quality and documentation that ensure institutional knowledge survives personnel changes, collaboration techniques that build trust across departments, infrastructure management that supports both routine reporting and experimental research, and career development paths that help retain top talent in a competitive market.
What Makes Football Analytics Leadership Different?
Football analytics directors face unique challenges that distinguish this role from analytics leadership in other domains:

- **Compressed timelines**: Decisions needed in hours or days, not weeks or months. During the season, analysis requested Monday morning may need to be delivered Tuesday afternoon to inform that week's game plan.
- **High visibility**: Mistakes are public and consequential. A flawed fourth-down recommendation broadcast to millions can damage credibility across the organization.
- **Cultural resistance**: Analytics often challenges traditional football wisdom accumulated over decades. Success requires changing minds while respecting institutional knowledge.
- **Seasonal cycles**: Intense periods (draft preparation, in-season game planning) alternate with strategic planning phases, requiring flexible resource allocation.
- **Stakeholder diversity**: Must communicate effectively with GMs focused on roster construction, coaches concerned with weekly preparation, scouts evaluating players, and owners tracking competitive advantage.
- **Limited resources**: Smaller teams than corporate analytics departments, often 1-8 people compared to dozens or hundreds in tech companies.
- **Competition for talent**: Technology companies and financial firms offer higher salaries for similar technical skills, making retention challenging.
- **Turnover impact**: Regime changes when coaches or GMs are fired can eliminate entire analytics departments, creating job insecurity unique to sports.

Analytics Team Structures
Effective analytics departments balance specialization with flexibility. The optimal structure depends on organizational size, budget, and strategic priorities. There is no one-size-fits-all approach—a team structure that works well for a large-market franchise with substantial resources may be impractical for a smaller organization operating with budget constraints.
The key tension in structuring analytics teams is between depth and breadth. Smaller teams need generalists who can handle multiple responsibilities—from data engineering to statistical modeling to stakeholder communication. Larger teams can afford specialists who develop deep expertise in specific domains like player evaluation, game strategy, or machine learning. Both approaches have merits, and the right choice depends on organizational context.
Another critical consideration is integration versus centralization. Should analytics staff be embedded within football operations departments (scouting, coaching, personnel) or centralized in a standalone analytics unit? Embedded analysts build stronger relationships and domain expertise but may lack technical support and methodological rigor. Centralized teams maintain higher technical standards but may struggle to understand operational needs. Many organizations adopt hybrid models, with a core analytics team supporting embedded specialists.
The structures outlined below represent common patterns observed across NFL and college football organizations. However, successful teams often customize these structures based on their specific needs, existing organizational culture, and available talent.
Small Teams (1-3 People)
Small analytics teams are common in budget-constrained organizations, startup franchises, or organizations just beginning to invest in analytics. These teams must be exceptionally efficient, focusing ruthlessly on high-impact projects while maintaining enough flexibility to respond to urgent requests.
Structure: Generalists who handle multiple responsibilities across the analytics lifecycle.
Typical Roles:
- Analytics Director: Strategy, stakeholder management, high-level analysis, methodology decisions, budget oversight, and external representation. This role often reports directly to the GM or head coach.
- Analyst(s): Data engineering (building pipelines, maintaining databases), statistical modeling, visualization, routine reporting, and tool development. Junior analysts may focus more on execution while senior analysts contribute to methodology and stakeholder engagement.
In very small teams (1-2 people), the director often performs significant hands-on analytical work rather than purely managing. This "player-coach" model requires balancing strategic leadership with technical contribution.
Advantages:
- Flexibility and rapid iteration: Small teams can pivot quickly, adapting priorities as organizational needs change without complex coordination.
- Low overhead and coordination costs: Minimal time spent in meetings or coordinating across sub-teams. Everyone knows what everyone else is working on.
- Deep understanding across all projects: Team members develop broad context, understanding how different analytical threads connect and avoiding duplicate work.
- Strong ownership and accountability: With limited staff, each person's contributions are highly visible, creating natural accountability.
- Tight stakeholder relationships: Small teams often work directly with decision-makers, building strong trust and understanding needs deeply.
Challenges:
- Limited bandwidth for complex projects: Ambitious multi-month projects may be impractical when urgent weekly requests dominate the calendar.
- Difficulty maintaining documentation: With constant pressure to deliver, documentation often takes a back seat, creating knowledge management problems.
- Vulnerability to key person dependencies: Single points of failure are common—if the one person who understands the draft model leaves, institutional knowledge evaporates.
- Work-life balance risks: Especially during intense periods (draft prep, playoff races), small teams may face unsustainable workload.
- Limited depth in any single area: Generalists may lack the specialized expertise needed for cutting-edge methodologies or complex statistical challenges.
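Many of these small-team pressures are eased by automating routine deliverables. As a minimal sketch (the inline CSV feed, its column names, and the `weekly_report` helper are all invented for illustration), a weekly EPA summary can be generated with nothing but the Python standard library:

```python
import csv
import io
from statistics import mean

# Toy play-by-play sample standing in for a real data feed
RAW = """team,week,epa
KC,1,0.21
KC,1,-0.10
BUF,1,0.05
BUF,1,0.30
"""

def weekly_report(raw_csv: str, week: int) -> str:
    """Render a plain-text EPA summary for one week, best teams first."""
    rows = [r for r in csv.DictReader(io.StringIO(raw_csv)) if int(r["week"]) == week]
    by_team: dict[str, list[float]] = {}
    for r in rows:
        by_team.setdefault(r["team"], []).append(float(r["epa"]))
    lines = [f"Week {week} EPA/play"]
    for team, epas in sorted(by_team.items(), key=lambda kv: -mean(kv[1])):
        lines.append(f"  {team}: {mean(epas):+.3f} ({len(epas)} plays)")
    return "\n".join(lines)

print(weekly_report(RAW, 1))
```

In practice the same pattern (pull data, summarize, render a fixed-format report) would be scheduled to run automatically each week, so the analyst's time goes to interpretation rather than assembly.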
Strategies for Small Team Success
If you're running or part of a small analytics team:

1. **Automate ruthlessly**: Invest time upfront to automate routine reports and data pipelines. Every hour spent on automation pays dividends throughout the season.
2. **Focus on decisions**: Prioritize projects directly tied to specific decisions (draft picks, free agent signings, fourth down calls) over pure research.
3. **Build simple, robust tools**: Prefer simple models that are easy to maintain and explain over complex models that require specialized expertise.
4. **Document as you go**: Write README files and code comments during development, not retrospectively. Future you will be grateful.
5. **Cultivate relationships early**: In-person meetings and informal conversations build trust more than formal presentations. Invest in relationship capital.
6. **Know when to say no**: Protecting the team from low-value requests is essential. Help stakeholders understand trade-offs.

Medium Teams (4-8 People)
Medium-sized teams represent a sweet spot for many organizations, large enough to develop specialized expertise while small enough to maintain coordination and alignment. These teams typically emerge as organizations mature their analytics capabilities and demonstrate value that justifies expanded investment.
Structure: Mix of specialists and generalists with clear domains of responsibility but enough cross-training to provide coverage when needed.
The visualization below illustrates a typical medium-sized team structure, showing reporting relationships and functional specialization:
```{r}
#| label: fig-team-structure-medium-r
#| fig-cap: "Medium-sized analytics team structure showing reporting relationships and functional specialization. This 6-person team balances domain expertise (player evaluation, game strategy, research) with technical infrastructure (data engineering)."
#| fig-width: 10
#| fig-height: 7
#| message: false
#| warning: false

library(tidyverse)
library(igraph)
library(ggraph)

# Define team structure with reporting relationships.
# This represents a common medium-team organizational pattern
# where specialized analysts report to a lead analyst who provides
# technical oversight, while data/analytics engineering operates
# as a separate track supporting the analytical team
team_structure <- tribble(
  ~from,           ~to,                     ~type,
  "Director",      "Lead Analyst",          "reports_to",
  "Director",      "Data Engineer",         "reports_to",
  "Lead Analyst",  "Player Eval Analyst",   "reports_to",
  "Lead Analyst",  "Game Strategy Analyst", "reports_to",
  "Lead Analyst",  "Research Analyst",      "reports_to",
  "Data Engineer", "Analytics Engineer",    "reports_to"
)

# Create directed graph object.
# In organizational charts, edges point from manager to direct report
g <- graph_from_data_frame(team_structure, directed = TRUE)

# Tree layout naturally represents hierarchical reporting structures
ggraph(g, layout = "tree") +
  geom_edge_link(
    aes(edge_colour = type),
    arrow = arrow(length = unit(3, "mm")),
    width = 1.2
  ) +
  geom_node_point(size = 12, color = "#003366", alpha = 0.8) +
  geom_node_text(
    aes(label = name),
    color = "white",
    size = 3.5,
    fontface = "bold"
  ) +
  scale_edge_colour_manual(values = c("reports_to" = "#003366")) +
  theme_void() +
  theme(legend.position = "none") +
  labs(
    title = "Medium Analytics Team Structure (6 people)",
    subtitle = "Specialized roles with clear reporting relationships"
  )
```
```{python}
#| label: fig-team-structure-medium-py
#| fig-cap: "Medium-sized analytics team structure showing reporting relationships - Python implementation demonstrates equivalent organizational structure"
#| fig-width: 10
#| fig-height: 7
#| message: false
#| warning: false

import matplotlib.pyplot as plt
import networkx as nx

# Define team structure edges.
# Each tuple represents a reporting relationship: (manager, direct report)
edges = [
    ("Director", "Lead Analyst"),
    ("Director", "Data Engineer"),
    ("Lead Analyst", "Player Eval\nAnalyst"),
    ("Lead Analyst", "Game Strategy\nAnalyst"),
    ("Lead Analyst", "Research\nAnalyst"),
    ("Data Engineer", "Analytics\nEngineer"),
]

# DiGraph = directed graph where edge direction matters
G = nx.DiGraph()
G.add_edges_from(edges)

# Create hierarchical layout manually.
# Positions are (x, y) coordinates; higher y = higher in the hierarchy.
# Analysts are spaced widely enough that the large node markers don't overlap.
pos = {
    "Director": (0, 3),
    "Lead Analyst": (-1.5, 2),
    "Data Engineer": (1.5, 2),
    "Player Eval\nAnalyst": (-3, 1),
    "Game Strategy\nAnalyst": (-1.5, 1),
    "Research\nAnalyst": (0, 1),
    "Analytics\nEngineer": (1.5, 1),
}

# Draw organizational chart
plt.figure(figsize=(12, 8))
nx.draw(
    G, pos,
    node_color="#003366",
    node_size=3000,
    with_labels=True,
    font_color="white",
    font_weight="bold",
    font_size=9,
    arrows=True,
    arrowsize=20,
    arrowstyle="->",
    edge_color="#003366",
    width=2,
)
plt.title(
    "Medium Analytics Team Structure (6 people)\n"
    "Specialized roles with clear reporting relationships",
    fontsize=14, fontweight="bold", pad=20,
)
plt.axis("off")
plt.tight_layout()
plt.show()
```
Typical Roles in Medium Teams:

- Analytics Director: Leadership, strategy, stakeholder management, resource allocation, hiring, and representing analytics in organizational decisions. Typically 20% technical work, 80% leadership and communication.
- Lead Analyst: Senior technical leadership, complex modeling, methodology decisions, code review, and mentoring junior staff. Serves as technical escalation point and ensures analytical rigor across projects. Often 60% technical work, 40% leadership.
- Data Engineer: Infrastructure, data pipelines, database management, API development, and ensuring data quality. Focuses on making data accessible and reliable for analysts rather than performing analysis themselves.
- Domain Specialists: Each specialist develops deep expertise in their area:
  - Player Evaluation Analyst: Draft models, free agency analysis, player tracking metrics, combine prediction models
  - Game Strategy Analyst: Weekly opponent reports, fourth down models, two-point conversion recommendations, play-calling optimization
- Research Analyst: Experimental methodologies, machine learning development, academic collaborations, long-term projects not tied to immediate decisions
- Analytics Engineer: Tool development, automation, dashboard creation, Shiny/Streamlit apps, quality assurance, and bridging the gap between data engineering and analysis. This role is particularly valuable for operationalizing analytical insights into tools decision-makers use daily.
Advantages:
- Specialization improves depth of analysis: Domain experts develop nuanced understanding that generalists may miss. The game strategy analyst becomes an expert in situational football; the player evaluation analyst masters scouting metrics.
- Better coverage of diverse analytics needs: Multiple projects can progress simultaneously rather than queuing behind a bottleneck.
- Mentorship opportunities for junior staff: Senior analysts and engineers can coach less experienced team members, accelerating development.
- Redundancy reduces key person risk: If one analyst leaves, others can cover their domain while a replacement is found, preventing catastrophic knowledge loss.
- Career path visibility: Team members can see progression from Analyst to Senior/Lead roles, improving retention.
Challenges:
- Coordination overhead increases: More people means more time spent in meetings, status updates, and ensuring alignment.
- Need for clear project management: Without systems like Kanban boards or sprint planning, work can become chaotic.
- Risk of silos between specialists: Player evaluation and game strategy analysts may duplicate work or miss opportunities to share insights if communication isn't intentional.
- Consistency of standards: Different analysts may adopt different coding styles, documentation practices, or analytical approaches without active management.
- Resource allocation conflicts: When both player evaluation and game strategy need urgent analysis during the same week, the director must prioritize.
The Lead Analyst Role is Critical
In medium-sized teams, the Lead Analyst role serves as the linchpin between technical work and organizational leadership. This person must:

- **Maintain technical credibility**: Actively coding and modeling to understand team challenges firsthand
- **Provide technical mentorship**: Reviewing code, guiding methodology, developing junior staff
- **Coordinate across domains**: Ensuring player evaluation and game strategy insights inform each other
- **Shield the team from chaos**: Filtering requests and protecting against scope creep
- **Represent technical perspective to leadership**: Translating technical constraints into business language

Organizations often promote their best analyst to this role, but the skills that make someone an excellent analyst (deep focus, technical perfectionism) differ from those needed for leadership (delegation, big-picture thinking, stakeholder management). Providing leadership training is essential.

Large Teams (9+ People)
Large analytics teams are found in organizations that have made substantial long-term investments in analytics, typically larger-market franchises or organizations whose leadership strongly values data-driven decision-making. These teams can pursue ambitious projects, maintain robust coverage across all analytical needs, and develop cutting-edge methodologies.
Structure: Specialized sub-teams organized by domain or function, each with internal hierarchy.
Typical Organization:
Large teams often structure themselves into functional pods or squads, each operating semi-independently while coordinating on shared infrastructure and methodologies:
- Player Evaluation Team (2-3 people)
  - Focus: Draft analytics, free agency evaluation, contract analysis, trade value assessment
  - Typical Roles: Lead analyst + 1-2 specialists focused on college and professional player evaluation respectively
  - Key Deliverables: Pre-draft player rankings, free agent target lists, contract value recommendations, trade evaluations
  - Seasonal Cycle: Intense during draft prep (January-April) and free agency (March), lighter during season
- Game Strategy Team (2-3 people)
  - Focus: Weekly opponent preparation, situational decision support, play-calling optimization, in-game strategy
  - Typical Roles: Lead analyst + 1-2 specialists for offense and defense
  - Key Deliverables: Weekly opponent reports, fourth down recommendations, two-point conversion models, optimal play-calling distributions
  - Seasonal Cycle: Intense during season (September-February), lighter in offseason
- Data Engineering Team (2-3 people)
  - Focus: Infrastructure and pipelines, database management, API development, data quality assurance
  - Typical Roles: Lead engineer + 1-2 engineers for different systems (e.g., one focused on play-by-play data, another on tracking data)
  - Key Deliverables: Reliable data pipelines, database optimization, internal APIs, automated data validation
  - Seasonal Cycle: Relatively stable year-round, with spikes when new data sources are integrated
- Research & Innovation Team (1-2 people)
  - Focus: New methodologies, machine learning and AI development, long-term experimental projects, academic collaborations
  - Typical Roles: Principal analyst or senior researcher, possibly with PhD-level expertise
  - Key Deliverables: Novel analytical approaches, proof-of-concept models, published research, competitive intelligence on emerging methods
  - Seasonal Cycle: Protected from urgent requests to maintain focus on long-term innovation
Some organizations add additional specialized teams:
- Player Health & Performance Team (1-2 people, if the organization invests heavily in sports science)
  - Injury risk modeling, load management, return-to-play predictions, performance optimization
  - Often collaborates closely with medical and strength & conditioning staff
- Salary Cap & Roster Construction Team (1-2 people, in organizations with complex financial planning)
  - Contract structuring optimization, roster construction modeling, salary cap forecasting, trade and release analysis
Advantages:
- Deep expertise in each domain: Specialists can develop mastery that generalists can't match, pursuing complex projects that require sustained focus.
- Capacity for ambitious projects: Can tackle machine learning pipelines, proprietary tracking data analysis, or multi-month research initiatives.
- Robust coverage across all needs: Multiple concurrent projects across different domains without sacrificing quality.
- Knowledge redundancy: Multiple people understand each domain, protecting against departures.
- Recruitment and development: Can hire specialists with narrow expertise and junior analysts who develop within specific domains.
- Competitive advantage: Sufficient scale to develop proprietary methodologies that smaller competitors can't replicate.
Challenges:
- High coordination costs: Significant time spent in cross-team meetings, alignment discussions, and information sharing.
- Potential for duplicate work: Without clear communication, different teams may solve similar problems independently.
- Complex communication requirements: Ensuring insights from research inform game strategy, and game strategy informs player evaluation, requires active coordination.
- Bureaucracy risk: Formal processes necessary for coordination can slow decision-making and reduce agility.
- Budget and headcount justification: Larger teams face more scrutiny on ROI and must continuously demonstrate value.
- Cultural fragmentation: Sub-teams may develop different cultures, standards, or priorities without intentional alignment.
- Management overhead: The director role becomes primarily managerial rather than technical, requiring a different skill set.
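A rough staffing heuristic is about one analyst per two to three decision-makers who regularly need analytics support. A toy helper makes the arithmetic concrete; the function name and the floor-division rounding are illustrative choices, not an established formula:

```python
def analyst_range(decision_makers: int) -> tuple[int, int]:
    """Rule-of-thumb staffing band: roughly 1 analyst per 2-3 decision-makers."""
    low = max(1, decision_makers // 3)   # lean coverage: one analyst per three stakeholders
    high = max(1, decision_makers // 2)  # fuller coverage: one analyst per two stakeholders
    return low, high

# A GM, two coordinators, and four position coaches = 7 decision-makers
low, high = analyst_range(7)
print(f"{low}-{high} analysts")  # → 2-3 analysts
```

The point is not precision but budgeting: counting the people who actually consume analysis gives a defensible starting estimate before any headcount conversation.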
Sizing Your Analytics Team
General guidelines for determining appropriate team size:

- **Small market/budget**: 1-3 people focusing on highest-impact areas. Prioritize game strategy (in-season value) and player evaluation (draft/FA value). Accept that some areas (research, specialized tools) won't be covered.
- **Medium organization**: 4-8 people with domain specialization. Can cover game strategy, player evaluation, and data engineering with some research capacity. Most organizations operate effectively at this scale.
- **Large organization**: 9+ people with sub-teams by function. Justifiable when:
  - Leadership strongly values analytics and funds appropriately
  - The organization can articulate specific use cases for specialized roles
  - The existing team is consistently oversubscribed with valuable projects
  - The recruiting pipeline can support hiring specialists
- **Ratio to consider**: Approximately 1 analyst per 2-3 decision-makers who regularly need analytics support. An organization with a GM, two coordinators, and four position coaches might need 2-3 analysts to provide adequate coverage.
- **Growth strategy**: Start small and prove value before expanding. It's easier to justify adding headcount when the existing team is demonstrably overloaded with high-value work than to build a large team speculatively.

Avoid Premature Scaling
A common mistake is building a large analytics team before establishing product-market fit—demonstrating that analytics can influence decisions and provide value. Large teams without clear deliverables and stakeholder buy-in face:

- **Budget cuts**: When value isn't obvious, analytics is often first to face budget pressure
- **Unclear priorities**: Without stakeholder demand, teams fill time with projects that don't matter
- **Talent atrophy**: Good analysts leave when their work doesn't impact decisions
- **Political vulnerability**: Large, expensive teams without defenders are eliminated during regime changes

Better approach: Start with 1-2 excellent people, demonstrate clear value, then expand incrementally based on demand.

Hiring Analytics Talent
Building an effective team starts with hiring the right people. Football analytics requires a unique blend of technical skills (statistics, programming, data engineering), domain knowledge (football strategy, rules, historical context), and soft skills (communication, collaboration, resilience). Finding candidates who excel in all three areas is rare, making hiring one of the most important and challenging aspects of building an analytics department.
The hiring market for football analytics professionals is highly competitive. Candidates with strong technical skills have lucrative options in technology, finance, and consulting. Those with football knowledge may lack statistical rigor. Those with both are in high demand and short supply. Successful hiring requires understanding which skills are essential versus learnable, crafting compelling opportunities beyond just salary, and running efficient processes that respect candidates' time.
Key Competencies
When evaluating candidates, assess capabilities across three dimensions: technical skills (can they do the analytical work?), domain knowledge (do they understand football?), and communication skills (can they influence decisions?). The relative importance of each dimension depends on role level and organizational context.
Technical Skills
Technical competencies form the foundation of analytical work. While specific tools and methods can be learned, certain fundamentals are difficult to teach without significant time investment.
Essential Skills (Required for all analytical roles):
- Programming: Proficiency in R or Python, including:
  - Data manipulation (dplyr/tidyverse in R, pandas in Python)
  - Writing functions and organizing code into reusable modules
  - Debugging and troubleshooting
  - Reading and understanding others' code
  - Version control basics (Git, GitHub)
- Statistics: Solid foundation in:
  - Regression analysis (linear, logistic)
  - Hypothesis testing and statistical significance
  - Probability distributions and uncertainty quantification
  - Experimental design and causal inference concepts
  - Recognizing when methods are being applied inappropriately
- Data manipulation: Ability to:
  - Write efficient SQL queries for data extraction
  - Perform joins, aggregations, and window functions
  - Clean messy data and handle missing values
  - Reshape data between wide and long formats
  - Validate data quality and identify anomalies
- Visualization: Creating clear, effective graphics using:
  - ggplot2 (R) or matplotlib/seaborn (Python)
  - Principles of visual design (color, scale, labeling)
  - Choosing appropriate chart types for different data
  - Creating publication-quality graphics
  - Interactive visualizations (Shiny, Plotly, Streamlit)
- Version control: Understanding of:
  - Git fundamentals (commit, push, pull, branch, merge)
  - GitHub workflows and collaboration
  - When and how to commit code
  - Writing meaningful commit messages
  - Basic branching strategies
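The aggregation and window-function skills above can be exercised end to end with Python's built-in sqlite3 module; the `plays` table and its values here are invented for illustration:

```python
import sqlite3

# In-memory toy table; schema and values are made up for illustration
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE plays (team TEXT, week INTEGER, epa REAL)")
conn.executemany(
    "INSERT INTO plays VALUES (?, ?, ?)",
    [("KC", 1, 0.20), ("KC", 2, 0.10), ("BUF", 1, -0.05), ("BUF", 2, 0.30)],
)

# One query combining two window functions:
# a per-team average and a running (cumulative) EPA ordered by week
rows = conn.execute("""
    SELECT team, week, epa,
           AVG(epa) OVER (PARTITION BY team)               AS team_avg_epa,
           SUM(epa) OVER (PARTITION BY team ORDER BY week) AS running_epa
    FROM plays
    ORDER BY team, week
""").fetchall()

for team, week, epa, team_avg, running in rows:
    print(f"{team} wk{week}: epa={epa:+.2f} avg={team_avg:+.3f} running={running:+.2f}")
```

A candidate comfortable with this pattern (partitioned averages, cumulative sums) can usually transfer it directly to real play-by-play warehouses, whatever the actual schema.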
Desirable Skills (Valuable but can be learned on the job):
- Machine learning: Experience with:
  - Classification and regression algorithms (random forests, XGBoost, neural networks)
  - Feature engineering and selection
  - Cross-validation and hyperparameter tuning
  - Model evaluation and performance metrics
  - Understanding when ML is appropriate versus overkill
- Bayesian methods: Familiarity with:
  - Hierarchical models and partial pooling
  - Prior specification and sensitivity analysis
  - Posterior inference and uncertainty quantification
  - Bayesian workflow (Stan, PyMC, JAGS)
  - Communicating Bayesian results to non-technical audiences
- Optimization: Understanding of:
  - Linear programming and constraint satisfaction
  - Roster construction optimization
  - Resource allocation problems
  - Trade-offs and Pareto frontiers
  - Translating business problems into mathematical optimization
- Big data tools: Experience with:
  - Distributed computing (Spark, Dask)
  - Working with datasets too large for memory
  - Optimizing code performance
  - Parallel processing strategies
- Cloud platforms: Familiarity with:
  - AWS, Azure, or Google Cloud Platform
  - Cloud storage (S3, Azure Blob Storage)
  - Cloud computing (EC2, virtual machines)
  - Managed databases and services
  - Cost management and optimization
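As a small sketch of the cross-validation idea listed above, here is a standard-library k-fold loop; the data values and the mean-baseline "model" are placeholders, chosen only to keep the mechanics visible:

```python
import random
from statistics import mean

def kfold(n: int, k: int, seed: int = 0):
    """Yield (train_idx, test_idx) splits for k-fold cross-validation."""
    idx = list(range(n))
    random.Random(seed).shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    for i in range(k):
        test = folds[i]
        train = [j for f in folds[:i] + folds[i + 1:] for j in f]
        yield train, test

# Hypothetical per-game EPA values; the "model" just predicts the training mean
y = [0.1, -0.2, 0.3, 0.0, 0.25, -0.1, 0.15, 0.05]

fold_mae = []
for train, test in kfold(len(y), k=4):
    pred = mean(y[j] for j in train)             # "fit" on the training fold
    mae = mean(abs(y[j] - pred) for j in test)   # score on the held-out fold
    fold_mae.append(mae)

print(f"CV MAE: {mean(fold_mae):.3f} across {len(fold_mae)} folds")
```

The same loop structure generalizes to real models: swap the mean baseline for any fit/predict pair, and the held-out folds give an honest estimate of out-of-sample error.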
Technical Skills: Depth vs. Breadth
For junior roles, prioritize:

- **Depth** in core skills (programming, statistics, data manipulation)
- Demonstrated ability to learn independently
- Evidence of completed projects showing technical competence

For senior roles, prioritize:

- **Breadth** across multiple technical areas
- Experience choosing appropriate methods for different problems
- Track record of delivering complete projects end-to-end
- Ability to mentor and review others' technical work

Don't expect candidates to be experts in everything. Someone strong in traditional statistics may be weak in machine learning, or vice versa. Teams benefit from diverse technical backgrounds.

Domain Knowledge
Football knowledge determines whether analysts can ask the right questions, interpret results appropriately, and communicate effectively with coaches and scouts. However, domain knowledge is generally more learnable than technical skills—a statistically strong candidate can learn football, but teaching rigorous statistics to someone without that background is much harder.
Football Understanding (Assess through discussion and take-home exercises):
- Rules and strategy: Understanding of:
  - Basic rules and game flow
  - Offensive and defensive formations
  - Down-and-distance situations and strategic implications
  - Clock management and end-of-game scenarios
  - Scoring systems and point values
  - Rule changes and their strategic impact
- Positional roles and responsibilities: Knowledge of:
  - What each position does and how they're evaluated
  - Position groups and their interactions
  - Scheme differences (3-4 vs. 4-3 defense, zone vs. man coverage, zone vs. gap run blocking)
  - How positional value varies (why quarterbacks are valued higher than running backs)
- Personnel groupings and formations: Familiarity with:
  - Common offensive personnel (11, 12, 21, 10 personnel)
  - Defensive personnel packages (base, nickel, dime)
  - Formation tendencies and matchup implications
  - How personnel grouping affects play type probabilities
- Game situations and contexts: Understanding that:
  - Third-and-short differs from third-and-long strategically
  - Early-game vs. late-game strategies shift based on score
  - Weather conditions affect play-calling and performance
  - Home field advantage varies by team and context
  - Opponent quality and specific matchups matter
- Historical trends and evolution: Awareness of:
  - How the game has changed over time (increased passing, rule changes favoring offense)
  - Why historical statistics need era adjustments
  - Evolution of analytics in football (from yard-based to EPA-based evaluation)
  - Current strategic trends (RPOs, pre-snap motion, two-high safety defenses)
Analytics Frameworks (Can be taught but demonstrated familiarity is valuable):
- Core concepts: Understanding of:
- Expected Points Added (EPA) and its advantages over yards
- Win Probability models and situational decision-making
- Success Rate and efficiency metrics
- DVOA, PFF grades, and other public metrics
- Limitations and assumptions of each framework
- Player evaluation methodologies: Familiarity with:
- Combine metrics and athletic testing
- Production-based models for draft prospects
- Aging curves and career trajectory patterns
- Positional value frameworks
- How to separate player talent from scheme and teammates
- Draft models and processes: Knowledge of:
- Historical draft value curves
- Trade value charts and their limitations
- Positional draft scarcity
- College-to-pro projection challenges
- Combine and performance data integration
- Salary cap mechanics: Understanding of:
- How NFL salary cap works (dead money, guarantees, prorations)
- Contract structuring strategies
- Free agency market dynamics
- Rookie wage scale and its implications
- Cap strategies for contending vs. rebuilding teams
- League-wide trends: Awareness of:
- How league-average pass/run ratios have evolved
- Offensive vs. defensive spending patterns
- Positional value shifts in draft and free agency
- Schematic trends (spread offense adoption, two-high safety defenses)
- How rules changes affect strategy
Assessing Football Knowledge in Interviews
Rather than testing trivia, assess analytical thinking about football:
**Good questions:**
- "How would you design a metric to evaluate offensive line performance?"
- "What makes quarterback evaluation more difficult than other positions?"
- "Why has pass rate increased over the past decade?"
- "How would you adjust for schedule strength when comparing teams?"
**Poor questions:**
- "Who won Super Bowl XVII?" (trivia, not analytical thinking)
- "What defense does this team run?" (Google-able facts)
- "Name all the positions on a football field" (memorization)
Focus on whether the candidate can reason analytically about football problems, not whether they've memorized facts.
Communication Skills
Technical brilliance matters little if insights can't influence decisions. Communication skills often distinguish good analysts from great ones, yet they're frequently underweighted in hiring processes that overemphasize technical assessments.
Critical Communication Abilities:
- Translate technical findings to non-technical audiences: The ability to:
- Explain statistical concepts without jargon
- Use analogies and examples that resonate with coaches and scouts
- Emphasize insights over methodology
- Calibrate detail level based on audience
- Anticipate questions and preempt confusion
- Present insights clearly under time pressure: Competence in:
- Organizing thoughts quickly when asked unexpected questions
- Delivering concise verbal updates (e.g., "What does the data say about this player?" asked in a hallway)
- Creating executive summaries that capture key points
- Handling interruptions and pivoting mid-presentation
- Maintaining composure when challenged or questioned
- Build relationships with skeptical stakeholders: Skills in:
- Active listening and understanding others' perspectives
- Empathy for traditional football knowledge and respecting expertise
- Finding common ground and building trust incrementally
- Handling disagreement professionally
- Being right without being arrogant
- Admitting uncertainty and mistakes
- Write concise, actionable reports: Ability to:
- Lead with conclusions rather than burying them in methodology
- Use clear, direct language avoiding academic writing conventions
- Structure documents for skimming (headings, bullets, visual hierarchy)
- Provide specific recommendations, not just observations
- Tailor writing style to audience (GM vs. coach vs. scout)
- Visualize complex information effectively: Talent for:
- Choosing appropriate chart types for data and message
- Using color and design principles to highlight insights
- Avoiding cluttered, confusing graphics
- Creating visualizations that stand alone without extensive explanation
- Iterating based on feedback to improve clarity
Communication Skills Are Hard to Teach
Technical skills can be developed through coursework, practice, and mentorship. Communication skills—particularly presenting under pressure and influencing skeptical audiences—are much harder to teach and slower to develop. When hiring, seriously weight:
- **Evidence of teaching or presenting** (TA experience, conference talks, blog posts)
- **Performance in interview presentations** (clarity, organization, response to questions)
- **Writing samples** (clarity, conciseness, ability to explain complex ideas)
- **References specifically asked about communication** ("How effective is this person at presenting to non-technical audiences?")
A candidate who is 80th percentile technically and 95th percentile in communication will likely outperform someone 95th percentile technically and 60th percentile in communication, especially in senior roles.
Job Descriptions
Well-crafted job descriptions attract qualified candidates while setting clear expectations. The examples below represent common roles with realistic responsibilities and qualifications. Customize based on your organization's specific needs and constraints.
These job descriptions illustrate progression from entry-level execution through senior technical leadership to organizational leadership. Notice how:
- Responsibilities shift from task execution (entry-level) to project ownership (senior) to strategic direction (director)
- Required experience increases while maintaining core technical competency expectations
- Communication and leadership weight increases at higher levels
- Salary ranges reflect both experience and market dynamics (directors must compete with corporate executive salaries)
Customizing Job Descriptions
Adapt these templates based on your specific context:
**For smaller markets or budget constraints:**
- Consolidate roles (entry-level analyst may perform both analysis and some data engineering)
- Adjust salary ranges based on cost-of-living and budget reality
- Emphasize growth opportunity and impact over compensation
- Highlight unique aspects (working with specific coaches, special data access)
**For organizations with established analytics:**
- Add more specific technical requirements (experience with specific tools or methods you use)
- Emphasize cultural fit and collaboration skills
- Require demonstrated success in similar environments
- Focus on specialized expertise over generalist capabilities
**For startup analytics functions:**
- Emphasize entrepreneurial mindset and comfort with ambiguity
- Look for generalists over specialists
- Weight relationship-building and stakeholder management heavily
- Seek candidates excited by building from scratch rather than joining established teams
Interview Process
The interview process serves dual purposes: evaluating candidates' fit for the role and giving candidates insight into the organization, team, and role. An effective process is rigorous enough to assess key competencies while remaining respectful of candidates' time and providing decision-relevant information to both sides.
The typical analytics hiring process involves multiple stages, each assessing different competencies:
1. Technical Assessment (2-4 hours take-home)
Provide candidates with a realistic football analytics problem using actual data.
Design Principles:
- Use real football data (play-by-play from nflfastR or similar) so candidates can demonstrate domain knowledge alongside technical skills
- Ask open-ended questions allowing multiple valid approaches rather than single "correct" answers
- Emphasize problem-solving approach over getting the exact right answer—how candidates think matters more than whether they happen to choose the optimal method
- Allow access to documentation and resources, as on the real job; banning Google tests memorization, not problem-solving
- Scope appropriately: 2-4 hours for entry-level, potentially longer for senior roles with complex modeling challenges
- Provide clear evaluation criteria so candidates know what you're assessing
Example Assessment Prompts:
Entry-Level:
"Using 2023 NFL play-by-play data, analyze fourth-down decision-making. Identify situations where teams' actual decisions (go, kick, punt) align or conflict with what the data suggests. Create a visualization and brief written summary (1-2 pages) presenting your findings and recommendations."
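To illustrate the expected scope (not a model answer), here is a minimal sketch of a first step a candidate might take: tabulating actual fourth-down decisions by distance-to-go. It uses a small synthetic frame with nflfastR-style column names (`down`, `ydstogo`, `play_type`); real work would load the actual 2023 play-by-play data.

```python
import pandas as pd

# Synthetic fourth-down plays (illustrative only; a real analysis would use
# nflfastR play-by-play data for the 2023 season)
plays = pd.DataFrame({
    "down": [4] * 8,
    "ydstogo": [1, 1, 2, 4, 5, 8, 10, 12],
    "play_type": ["run", "punt", "pass", "punt",
                  "field_goal", "punt", "punt", "punt"],
})

# Classify each decision: going for it = run/pass; otherwise kick or punt
plays["decision"] = plays["play_type"].map(
    {"run": "go", "pass": "go", "punt": "punt", "field_goal": "kick"}
)

# Bucket by distance-to-go and compute decision rates: the starting point
# for comparing observed behavior against model recommendations
plays["distance_bucket"] = pd.cut(
    plays["ydstogo"], bins=[0, 2, 6, 99], labels=["short", "medium", "long"]
)
rates = (
    plays.groupby(["distance_bucket", "decision"], observed=True)
    .size()
    .unstack(fill_value=0)
    .pipe(lambda df: df.div(df.sum(axis=1), axis=0))
)
print(rates)
```

From there, a candidate would compare these observed rates against go/kick/punt recommendations from an expected-points or win-probability model.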
Senior-Level:
"Build a model predicting wide receiver success in the NFL using college production and combine data. Define 'success' appropriately, select relevant features, develop a predictive model, validate performance, and present findings. Deliver code, documentation, and a 3-5 page report explaining your methodology, results, and limitations."
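For the senior prompt, reviewers care most about the workflow: an explicit success definition, a fitted model, and out-of-sample validation. The sketch below compresses that workflow onto synthetic prospect data with a hand-rolled logistic regression; the features, label construction, and coefficients are illustrative assumptions, not a recommended specification.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200

# Synthetic prospect features (placeholders for college production + combine)
X = np.column_stack([
    rng.normal(1000, 300, n),   # college receiving yards (final season)
    rng.normal(4.5, 0.1, n),    # 40-yard dash time
])
# Toy "success" label, loosely tied to production and speed plus noise;
# a real submission would justify its success definition from NFL outcomes
logit = 0.004 * (X[:, 0] - 1000) - 8 * (X[:, 1] - 4.5) + rng.normal(0, 1, n)
y = (logit > 0).astype(float)

# Standardize features and add an intercept column
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
Xs = np.column_stack([np.ones(n), Xs])

# Train/validation split: validating out of sample is the key habit to show
train, val = slice(0, 150), slice(150, n)

# Logistic regression via plain gradient descent (a library would normally do this)
w = np.zeros(Xs.shape[1])
for _ in range(2000):
    p = 1 / (1 + np.exp(-Xs[train] @ w))
    w -= 0.1 * Xs[train].T @ (p - y[train]) / 150

# Report validation accuracy, not training accuracy
p_val = 1 / (1 + np.exp(-Xs[val] @ w))
acc = ((p_val > 0.5) == y[val]).mean()
print(f"validation accuracy: {acc:.2f}")
```

The 3-5 page report would then discuss the limitations this sketch glosses over: survivorship bias in who reaches the NFL, scheme effects on college production, and uncertainty in any single-number prediction.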
What to Assess:
- Code quality: Readable, organized, documented, following best practices
- Statistical rigor: Appropriate methods, understanding of assumptions, avoiding common pitfalls
- Communication: Clear explanations, effective visualizations, insight over methodology
- Football intuition: Do findings make football sense? Are counterintuitive results explored?
- Practical thinking: Acknowledging limitations, uncertainty quantification, actionable recommendations
2. Presentation (35-40 minutes: 15-20 min presentation + 20 min Q&A)
Candidates present their technical assessment to a mixed audience including analytics team members (assessing technical approach) and non-technical stakeholders like coaches or scouts (assessing communication).
Structure:
- Candidate presents findings (15-20 minutes): methodology, results, recommendations
- Technical questions (10 minutes): Analytics team probes methodology, asks about alternative approaches, explores edge cases
- Applied questions (10 minutes): Football operations or coaching staff ask about practical application and football implications
What to Assess:
- Presentation skills: Organization, clarity, pacing, slide design
- Audience adaptation: Can they explain the same concept at different detail levels?
- Handling questions: Composure under pressure, admitting uncertainty when appropriate, thinking on their feet
- Football communication: Do they speak "football" or just "statistics"? Can coaches understand them?
Include Non-Technical Evaluators
Always include at least one non-technical stakeholder (scout, coach, football operations staff) in the presentation interview. Their assessment of "Can I work with this person?" and "Do they understand football?" is as important as the analytics team's technical evaluation. If the scout says "I have no idea what this person was talking about" or the coach says "They don't seem to get how football actually works," that's critical feedback regardless of technical brilliance.
3. Football Knowledge Discussion (30-60 minutes)
A semi-structured conversation exploring football understanding, analytical thinking about football problems, and cultural fit.
Topics to Explore:
- Discuss their football background: How did they get interested in football analytics? What aspects of the game fascinate them?
- Review public work: If they have a blog, Twitter presence, or GitHub portfolio, discuss specific projects. What motivated this analysis? What did they learn?
- Explore football philosophy: How has analytics changed the game? What do they think traditional scouts/coaches get right that analytics misses? Where do they think analytics has the most room for growth?
- Pose hypothetical scenarios: "If you could add any one piece of data we don't currently have, what would it be and why?" or "How would you approach evaluating [specific position]?"
- Assess cultural fit: What frustrates them? How do they handle disagreement? What kind of work environment helps them thrive?
What to Assess:
- Genuine football passion (vs. just wanting a job in sports)
- Analytical thinking about football (not just trivia knowledge)
- Intellectual humility (acknowledging what they don't know)
- Cultural alignment (will they fit with team and organization?)
- Coachability (receptive to feedback and new perspectives)
4. Behavioral Interviews (Multiple rounds with different stakeholders)
Structured interviews using behavioral questions to assess competencies like problem-solving, collaboration, resilience, and learning mindset.
Sample Questions by Competency:
Problem-Solving Under Pressure:
- "Describe a time you had to deliver analysis under a tight deadline. How did you approach it?"
- "Tell me about a time your analysis led to an unexpected or counterintuitive finding. What did you do?"
Handling Disagreement/Pushback:
- "Describe a situation where a stakeholder disagreed with your recommendation. How did you handle it?"
- "Tell me about a time you were wrong about something analytical. How did you discover and address it?"
Collaboration and Teamwork:
- "Describe your experience working with people who have different expertise than you. How did you ensure effective collaboration?"
- "Tell me about a time you had to explain a complex technical concept to a non-technical audience. What approach did you take?"
Learning and Growth:
- "What's the most valuable piece of feedback you've received? How did you act on it?"
- "Describe a time you had to learn a new technical skill or method quickly. How did you approach it?"
Interview Panel Should Include:
- Analytics team members: Assessing technical collaboration and team fit
- Football operations staff: Assessing cross-functional collaboration potential
- People manager: Assessing coachability, professionalism, career goals
- Potential skip-level manager: Assessing long-term potential and strategic thinking
5. Practical/Cultural Fit Considerations
Beyond formal interviews, assess:
- Reference checks: Speak with former managers and colleagues, asking specifically about communication skills, collaboration, resilience under pressure, and technical strengths/weaknesses
- Work samples beyond assessment: Review blog posts, GitHub repositories, conference presentations, Twitter analysis threads to understand public-facing work quality
- Logistics and expectations: Discuss salary expectations, start date, relocation requirements, work arrangements early to avoid surprises
- Sell the opportunity: Help candidates understand why they should choose your organization—access to decision-makers, interesting problems, team culture, growth opportunities
Common Hiring Mistakes
Avoid these pitfalls that undermine hiring effectiveness:
**Over-Emphasizing Football Fandom:** Hiring the biggest fan over the best analyst often backfires. Passion for football matters, but competence matters more. Someone who became interested in football through analytics may outperform a lifelong fan with weaker technical skills.
**Narrow Technical Requirements:** Requiring expertise in specific tools (e.g., "must know XGBoost") over general competency (e.g., "strong machine learning background") excludes great candidates who could learn your tools quickly.
**Ignoring Communication Skills:** The brilliant analyst who can't explain findings to coaches provides limited value. Communication is a core competency, not a "nice to have."
**Unrealistic Technical Assessments:** Asking candidates to spend 20 hours on take-home assignments or solve complex problems in 1-hour live coding sessions often drives away strong candidates with other options. Respect their time.
**Not Testing Real-World Skills:** Leetcode-style algorithm questions or whiteboard coding may assess computer science knowledge but don't predict success at football analytics work involving messy data, ambiguous problems, and stakeholder communication.
**Moving Too Slowly:** Top candidates have multiple opportunities. If your process takes 8 weeks and 7 interviews, they'll accept other offers. Balance thoroughness with speed.
**Hiring Mini-Me's:** Teams benefit from diversity of backgrounds, methods, and perspectives. Don't just hire people who think like you or have identical backgrounds.
Project Prioritization and Resource Allocation
Analytics teams face constant tension between competing demands: urgent requests from coaches preparing for this week's game, important player evaluation projects supporting upcoming draft decisions, infrastructure work that will pay dividends long-term, and exploratory research that might yield breakthrough insights. With limited time and people, effective prioritization becomes essential for maximizing impact.
Without a systematic approach to prioritization, teams fall into common traps: constantly firefighting urgent requests while never completing important projects, saying yes to everything and delivering mediocre work, or pursuing personally interesting research while neglecting stakeholders' needs. A structured prioritization framework helps teams make transparent, defensible decisions about where to invest limited resources.
The framework presented here balances multiple factors: impact on decisions, urgency, resource requirements, and confidence in success. While the specific weights and criteria can be adapted to your organization's priorities, the process of explicit prioritization—rather than implicit, ad-hoc decisions—improves both effectiveness and team morale by creating shared understanding of why some projects happen and others don't.
Prioritization Framework
The following framework scores projects across multiple dimensions, combines them into an overall priority score, and ranks projects accordingly. This creates transparency about prioritization logic and enables productive conversations when stakeholders disagree with priorities.
Before diving into the code implementing this framework, let's understand the conceptual approach:
Core Concept: Each project receives scores (typically 1-10) across multiple dimensions that capture different aspects of value and feasibility. These scores are normalized and combined using weights that reflect organizational priorities. Projects with higher composite scores receive priority for resource allocation.
Key Dimensions:
- Impact: How much does this improve decision-making quality or competitive advantage?
- Urgency: How time-sensitive is this project?
- Resources: How much time and effort is required? (inverse-coded so lower resource needs score higher)
- Confidence: How likely are we to succeed with available data and methods?
Weighting Philosophy: Different organizations may weight these dimensions differently based on their situation:
- Teams in "prove it" mode might weight Impact highest (40-50%) to demonstrate clear value
- Teams during intense season periods might weight Urgency higher (30-40%) to support immediate needs
- Small teams with limited capacity might weight Resources higher (25-30%) to avoid overcommitment
- Research-oriented teams might weight Confidence lower (10-15%), embracing higher-risk experimental projects
The implementation below demonstrates this framework in action with example projects typical of football analytics departments:
#| label: project-prioritization-r
#| message: false
#| warning: false
library(tidyverse)
library(gt)
# Project prioritization function
# This function implements a weighted scoring model for prioritizing
# analytics projects based on multiple criteria
prioritize_projects <- function(projects) {
  projects %>%
    mutate(
      # Normalize scores to 0-10 scale for consistent weighting
      # This ensures all dimensions contribute equally to the final score
      # regardless of their initial scoring range
      impact_score = impact / max(impact) * 10,
      urgency_score = urgency / max(urgency) * 10,
      # Resources are inverse-coded: lower resource needs = higher score
      # The 11 - resources transformation ensures low-resource projects
      # score higher while maintaining the 0-10 scale
      resources_score = (11 - resources) / 10 * 10,
      confidence_score = confidence / max(confidence) * 10,
      # Calculate weighted priority score
      # Weights reflect organizational priorities and can be adjusted
      # Current weighting:
      # - Impact (40%): Primary driver of value
      # - Urgency (25%): Time sensitivity matters but doesn't dominate
      # - Resources (20%): Feasibility constraint given limited capacity
      # - Confidence (15%): Some weight to likelihood of success
      priority_score = (
        impact_score * 0.40 +      # Impact: 40%
        urgency_score * 0.25 +     # Urgency: 25%
        resources_score * 0.20 +   # Resources: 20%
        confidence_score * 0.15    # Confidence: 15%
      )
    ) %>%
    # Sort projects by priority score, highest first
    arrange(desc(priority_score)) %>%
    # Assign rank numbers for easy reference
    mutate(rank = row_number())
}
# Example projects representing typical analytics department workload
# Scores are on 1-10 scale for each dimension
projects <- tribble(
  ~project, ~impact, ~urgency, ~resources, ~confidence, ~category,
  # High-impact, time-sensitive projects
  "Draft Model Updates", 9, 8, 6, 8, "Player Evaluation",
  "4th Down Decision Tool", 10, 9, 4, 9, "Game Strategy",
  "Weekly Opponent Report", 7, 10, 3, 9, "Game Strategy",
  # Important but less urgent player evaluation work
  "FA Value Model", 8, 5, 7, 6, "Player Evaluation",
  "Contract Value Analysis", 8, 8, 6, 7, "Personnel",
  # Moderate-impact tactical projects
  "Red Zone Play Calling", 6, 6, 5, 7, "Game Strategy",
  "Salary Cap Dashboard", 6, 7, 4, 8, "Personnel",
  # High-impact but resource-intensive or uncertain projects
  "Roster Construction Optimization", 9, 3, 10, 5, "Personnel",
  "Injury Risk Modeling", 7, 6, 9, 4, "Player Health",
  # Lower priority research projects
  "Play Success Prediction", 5, 4, 8, 6, "Research"
)
# Apply prioritization function
prioritized <- prioritize_projects(projects)
# Display results in formatted table
prioritized %>%
  select(
    rank, project, category,
    impact, urgency, resources, confidence,
    priority_score
  ) %>%
  gt() %>%
  tab_header(
    title = "Analytics Project Prioritization",
    subtitle = "Weighted scoring model (Impact 40%, Urgency 25%, Resources 20%, Confidence 15%)"
  ) %>%
  cols_label(
    rank = "Rank",
    project = "Project",
    category = "Category",
    impact = "Impact",
    urgency = "Urgency",
    resources = "Resources",
    confidence = "Confidence",
    priority_score = "Priority Score"
  ) %>%
  # Format priority score to one decimal place
  fmt_number(
    columns = priority_score,
    decimals = 1
  ) %>%
  # Color-code priority scores for visual scanning
  # Dark red = high priority, light pink = low priority
  data_color(
    columns = priority_score,
    colors = scales::col_numeric(
      palette = c("#fee5d9", "#de2d26"),
      domain = c(0, 10)
    )
  ) %>%
  # Highlight top 3 priorities with gray background
  tab_style(
    style = cell_fill(color = "#f0f0f0"),
    locations = cells_body(rows = rank <= 3)
  ) %>%
  tab_source_note("Score interpretation: 1-10 scale for each dimension")
#| label: project-prioritization-py
#| message: false
#| warning: false
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
import seaborn as sns
def prioritize_projects(df):
    """
    Prioritize analytics projects using a weighted scoring model.

    Parameters
    ----------
    df : pd.DataFrame
        DataFrame with columns: project, impact, urgency, resources, confidence.
        All scores should be on a 1-10 scale.

    Returns
    -------
    pd.DataFrame
        Input DataFrame with added normalized scores, priority_score, and rank.
    """
    # Normalize scores to 0-10 scale
    # This handles cases where input scores might use different scales
    df['impact_score'] = (df['impact'] / df['impact'].max()) * 10
    df['urgency_score'] = (df['urgency'] / df['urgency'].max()) * 10
    # Resources inverse-coded: lower resources needed = higher score
    df['resources_score'] = ((11 - df['resources']) / 10) * 10
    df['confidence_score'] = (df['confidence'] / df['confidence'].max()) * 10
    # Calculate weighted priority score
    # Weights can be adjusted based on organizational priorities
    df['priority_score'] = (
        df['impact_score'] * 0.40 +      # Impact weight: 40%
        df['urgency_score'] * 0.25 +     # Urgency weight: 25%
        df['resources_score'] * 0.20 +   # Resources weight: 20%
        df['confidence_score'] * 0.15    # Confidence weight: 15%
    )
    # Rank projects by priority score (highest = rank 1)
    df['rank'] = df['priority_score'].rank(ascending=False, method='first').astype(int)
    return df.sort_values('priority_score', ascending=False)
# Example projects with scores across dimensions
projects = pd.DataFrame({
    'project': [
        'Draft Model Updates',
        '4th Down Decision Tool',
        'Weekly Opponent Report',
        'FA Value Model',
        'Red Zone Play Calling',
        'Roster Construction Optimization',
        'Play Success Prediction',
        'Injury Risk Modeling',
        'Salary Cap Dashboard',
        'Contract Value Analysis'
    ],
    # Impact: How much does this improve decision quality? (1-10)
    'impact': [9, 10, 7, 8, 6, 9, 5, 7, 6, 8],
    # Urgency: How time-sensitive is this project? (1-10)
    'urgency': [8, 9, 10, 5, 6, 3, 4, 6, 7, 8],
    # Resources: How many person-weeks required? (1-10, higher = more resources)
    'resources': [6, 4, 3, 7, 5, 10, 8, 9, 4, 6],
    # Confidence: How likely to succeed? (1-10)
    'confidence': [8, 9, 9, 6, 7, 5, 6, 4, 8, 7],
    'category': [
        'Player Evaluation', 'Game Strategy', 'Game Strategy',
        'Player Evaluation', 'Game Strategy', 'Personnel',
        'Research', 'Player Health', 'Personnel', 'Personnel'
    ]
})
# Apply prioritization function
prioritized = prioritize_projects(projects)
# Display summary of top priorities
print("Analytics Project Prioritization")
print("=" * 80)
print("\nTop Priorities:")
print(prioritized[['rank', 'project', 'category', 'priority_score']].head(10).to_string(index=False))
# Create visualizations showing prioritization logic
fig, axes = plt.subplots(1, 2, figsize=(14, 6))
# 1. Priority scores bar chart
ax1 = axes[0]
colors = plt.cm.RdYlGn(prioritized['priority_score'] / 10)
ax1.barh(prioritized['project'], prioritized['priority_score'], color=colors)
ax1.set_xlabel('Priority Score', fontsize=11)
ax1.set_title('Project Priority Rankings', fontsize=13, fontweight='bold')
ax1.invert_yaxis() # Highest priority at top
ax1.grid(axis='x', alpha=0.3)
# 2. Impact vs Urgency matrix
# This visualization helps identify projects that are both high-impact and urgent
# (upper-right quadrant) vs. low-impact and non-urgent (lower-left quadrant)
ax2 = axes[1]
scatter = ax2.scatter(
    prioritized['impact'],
    prioritized['urgency'],
    s=prioritized['priority_score'] * 30,  # Bubble size = priority
    c=prioritized['priority_score'],       # Color = priority
    cmap='RdYlGn',
    alpha=0.6,
    edgecolors='black'
)
ax2.set_xlabel('Impact (1-10)', fontsize=11)
ax2.set_ylabel('Urgency (1-10)', fontsize=11)
ax2.set_title('Impact vs Urgency Matrix', fontsize=13, fontweight='bold')
ax2.grid(alpha=0.3)
# Add quadrant lines to help interpret the matrix
# Upper-right quadrant = high impact + high urgency = top priority
# Lower-left quadrant = low impact + low urgency = lower priority
ax2.axhline(y=5.5, color='gray', linestyle='--', alpha=0.5)
ax2.axvline(x=5.5, color='gray', linestyle='--', alpha=0.5)
plt.colorbar(scatter, ax=ax2, label='Priority Score')
plt.tight_layout()
plt.show()
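Because the final ranking is a weighted sum, the weights themselves can flip the order, so it is worth checking how sensitive your top priorities are to reasonable re-weighting. A minimal sketch of such a check, using two hypothetical projects and two weight profiles in the spirit of the weighting-philosophy discussion above (all names and numbers here are illustrative):

```python
import numpy as np

# Hypothetical dimension scores (impact, urgency, resources inverse-coded,
# confidence), all on the 0-10 scale used by the framework above
scores = {
    "Offseason roster research": np.array([10, 2, 6, 8]),
    "Weekly opponent report": np.array([6, 10, 7, 9]),
}

# Two illustrative weight profiles (order: impact, urgency, resources, confidence)
profiles = {
    "prove-it mode": np.array([0.50, 0.15, 0.20, 0.15]),
    "in-season": np.array([0.25, 0.40, 0.20, 0.15]),
}

for name, w in profiles.items():
    # Rank projects by the weighted dot product of scores and weights
    ranked = sorted(scores, key=lambda p: scores[p] @ w, reverse=True)
    top = ranked[0]
    print(f"{name}: top priority is '{top}' (score {scores[top] @ w:.2f})")
```

Here the two projects swap places between profiles: the impact-heavy weighting favors the offseason research, while the urgency-heavy weighting favors the weekly report. When a flip like this appears among your actual top priorities, it is worth an explicit conversation before committing the team's calendar.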
The prioritization framework produces concrete rankings, but how should these scores be interpreted? The following criteria provide guidance for evaluating projects across each dimension:
Prioritization Criteria
These scoring rubrics help teams consistently evaluate projects across the four key dimensions. While some subjectivity remains, explicit criteria reduce inconsistency and enable productive discussions about scores.
Impact (40% weight): How much will this improve decision-making quality or provide competitive advantage?
- 10: Game-changing strategic advantage that fundamentally improves major decisions (e.g., a draft model that significantly outperforms existing approaches, a fourth-down model that changes strategy)
- 7-9: Significant competitive benefit for important recurring decisions (e.g., improved weekly opponent analysis, better free agency targeting, optimized play-calling)
- 4-6: Modest improvement to existing processes or provides useful information for occasional decisions (e.g., enhanced scouting reports, trade value estimation, depth chart optimization)
- 1-3: Minimal impact on actual decisions; primarily informational or exploratory (e.g., historical trend analysis, benchmarking against other teams)
Consider: Who uses this? How often? What decisions does it inform? What's the current alternative? How much better is this?
Urgency (25% weight): When is this needed? What's the cost of delay?
- 10: Needed this week or sooner (e.g., analysis for this Sunday's game, immediate response to injury or roster move)
- 7-9: Needed this month or within current season phase (e.g., draft prep analysis before combines, free agency analysis before market opens)
- 4-6: Needed this season or fiscal year but no immediate deadline (e.g., offseason research projects, infrastructure improvements for next season)
- 1-3: No time pressure; nice-to-have or long-term strategic projects (e.g., exploratory research, process improvements, documentation projects)
Consider: What's the deadline? What happens if we deliver late? Does urgency reflect real decision timing or arbitrary preferences?
Resources (20% weight): How much effort is required to complete this successfully?
- 1-3: Can be completed in days with existing tools and data (e.g., running existing model on new data, generating routine report)
- 4-6: Requires weeks of work, possibly including new data acquisition or moderate methodology development (e.g., building new visualization dashboard, developing a focused prediction model)
- 7-9: Requires months of sustained effort, possibly involving new data partnerships, significant infrastructure work, or complex modeling (e.g., integrating tracking data, building comprehensive draft model)
- 10: Requires quarters or years; large-scale projects involving multiple people, major infrastructure changes, or research-level innovation (e.g., computer vision system for film analysis, complete data platform rebuild)
Consider: How many person-weeks of work? What skills required? What dependencies exist? Can it be scoped down?
Confidence (15% weight): How likely are we to succeed given available data, methods, and expertise?
- 10: Proven methodology with available high-quality data; very confident we can deliver valuable results (e.g., EPA-based analysis using nflfastR data)
- 7-9: Good approach with some uncertainty about data quality or methodology; likely to succeed with acceptable iteration (e.g., draft model using combine metrics and college stats)
- 4-6: Experimental approach or questionable data quality; may work but significant risk of limited value (e.g., injury prediction using limited public data)
- 1-3: High risk, exploratory work; unclear if approach will yield useful results (e.g., predicting play-calling based on coordinator gestures, using social media sentiment to predict performance)
Consider: Do we have the right data? Do we know how to approach this? Have others succeeded at similar problems? What could go wrong?
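When teams score projects in a shared sheet, recording the rubric band alongside the raw number keeps debates anchored to the written criteria rather than gut feel. A small helper along these lines (the function name and example scores are ours; the band boundaries come from the rubrics above):

```python
def rubric_band(score: int) -> str:
    """Map a 1-10 dimension score to its rubric band (bands as defined above)."""
    if not 1 <= score <= 10:
        raise ValueError("scores must be on a 1-10 scale")
    if score >= 10:
        return "10"
    if score >= 7:
        return "7-9"
    if score >= 4:
        return "4-6"
    return "1-3"

# Example: label the '4th Down Decision Tool' scores from the earlier table
for dim, score in {"impact": 10, "urgency": 9,
                   "resources": 4, "confidence": 9}.items():
    print(f"{dim}: {score} -> band {rubric_band(score)}")
```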
Project Prioritization Best Practices
Effective prioritization requires more than just running the scoring model. Implement these practices to maximize impact: 1. **Revisit regularly**: Priorities change with seasons, organizational needs, and new information. Review and re-score projects monthly during offseason, weekly during season. 2. **Consider portfolio balance**: Don't only do urgent projects while neglecting important long-term work. Reserve capacity (e.g., 20% of team time) for research and infrastructure even during busy periods. 3. **Stakeholder input**: Include decision-makers in the prioritization process, particularly for scoring Impact and Urgency. Their perspective on which projects matter most is essential. 4. **Quick wins**: Balance long-term ambitious projects with short-term deliverables that build credibility and demonstrate value. Especially important for new teams. 5. **Say no thoughtfully**: Protecting the team from low-value requests is a core leadership responsibility. Use the prioritization framework to explain why certain requests are deferred: "This scores as priority #15; we're currently working on priorities #1-5." 6. **Document decisions**: When projects are deprioritized, document why. This prevents relitigating the same decisions repeatedly and helps stakeholders understand trade-offs. 7. **Negotiate scope**: If a low-priority project is politically important, ask "Could we do a simplified version that requires fewer resources?" Often a quick analysis provides 80% of the value at 20% of the cost. 8. **Communicate transparently**: Share the prioritization framework and current rankings with stakeholders. This builds trust and helps them understand why their request might not be happening immediately.Pitfalls to Avoid
Common mistakes in project prioritization:

- **Only doing urgent work**: Teams that respond purely to urgency become firefighters, constantly reacting rather than proactively building capabilities. Reserve capacity for important non-urgent work.
- **Ignoring stakeholder politics**: Sometimes a technically low-priority project matters politically (a new GM's priority, relationship-building with a skeptical coach). Pure scoring models can be too rigid.
- **Perfectionism paralysis**: Don't wait to start important projects until you can do them perfectly. Better to deliver good analysis on time than perfect analysis too late.
- **Scope creep**: Projects often expand beyond initial scope. Actively manage scope and be willing to deliver Phase 1 before committing to Phase 2.
- **Sunk cost fallacy**: If a project isn't working out, be willing to stop and reallocate resources rather than persisting because you've already invested time.
- **Individual preferences dominating**: Team members naturally gravitate toward projects they find interesting. Ensure organizational priorities, not personal preferences, drive allocation.

Agile Methodologies for Analytics
Agile project management, originally developed for software engineering, provides frameworks that help analytics teams deliver value incrementally, respond to changing needs, and maintain alignment despite uncertainty. While football analytics differs from software development in important ways, adapted agile practices significantly improve team effectiveness, particularly for teams larger than 2-3 people.
Traditional project management approaches—developing detailed plans upfront, executing according to the plan, delivering at the end—work poorly for analytics work. Analysis often reveals unexpected findings that change the direction of inquiry. Stakeholder needs evolve as they see preliminary results. Data quality issues emerge mid-project. Requirements are often unclear at the start. Agile methodologies explicitly embrace this uncertainty and build in mechanisms for adaptation.
The core agile principles that translate well to analytics:
- Iterative development: Deliver working analysis in cycles rather than one big delivery
- Regular feedback: Frequent stakeholder check-ins to ensure work remains valuable
- Transparent workflow: Visible work status so everyone knows what's happening
- Retrospection and improvement: Regular reflection on process to continuously improve
- Adaptability: Embrace changing priorities rather than rigidly sticking to initial plans
Sprint-Based Workflows
Sprints—fixed time periods (typically 1-4 weeks) with defined goals and deliverables—provide structure and rhythm to analytical work. The sprint model works particularly well for analytics teams because it creates natural points for:
- Committing to specific deliverables
- Reviewing progress with stakeholders
- Reflecting on what's working and what isn't
- Adjusting priorities based on new information
Two-Week Sprints: The Standard Cycle
Most analytics teams find two-week sprints provide the right balance: long enough to complete meaningful work, short enough to adapt quickly and maintain focus.
Sprint Structure and Cadence:
1. Sprint Planning (Monday, Week 1 - 1-2 hours)
The team meets to commit to work for the upcoming sprint:
- Review backlog: Examine prioritized list of potential projects and tasks
- Select sprint goals: Choose 2-4 major goals for the sprint based on priorities, capacity, and dependencies
- Break down tasks: Decompose goals into specific tasks (e.g., "Update draft model" becomes: pull new data, validate data quality, retrain model, test predictions, document changes, present results)
- Assign ownership: Designate a person responsible for each task (not necessarily doing all the work, but owning its completion)
- Define "done" criteria: Explicitly state what "done" means (code reviewed? documented? presented? deployed?)
- Capacity check: Ensure committed work is realistic given team capacity (account for meetings, ongoing responsibilities, known time off)
Best Practice: Don't overcommit. Better to finish what you commit to than constantly carry work across sprints, which demoralizes teams and frustrates stakeholders.
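The capacity check in step 6 can be made concrete with back-of-the-envelope arithmetic. The sketch below is one way to do it; the focus factor, meeting load, and task estimates are illustrative assumptions you would calibrate from your own past sprints.

```python
# Back-of-the-envelope sprint capacity check; the focus factor,
# meeting load, and estimates below are illustrative assumptions.
def sprint_capacity_hours(team_size, sprint_days=10, hours_per_day=8,
                          focus_factor=0.6, meeting_hours_per_person=8,
                          time_off_hours=0):
    """Usable analysis hours in a two-week sprint.

    focus_factor discounts gross hours for context switching and
    ad-hoc requests; tune it from historical sprint data.
    """
    gross = team_size * sprint_days * hours_per_day
    return gross * focus_factor - team_size * meeting_hours_per_person - time_off_hours

committed = {
    "Retrain draft model": 40,
    "Automate weekly report": 30,
    "Opponent tendency breakdown": 20,
}

capacity = sprint_capacity_hours(team_size=3, time_off_hours=16)
total = sum(committed.values())
print(f"Capacity: {capacity:.0f}h, committed: {total}h")
if total > capacity:
    print("Overcommitted - trim the sprint backlog before committing.")
else:
    print("Commitment fits; leave the slack for the unexpected.")
```

If committed hours exceed capacity, the time to find out is in planning, not mid-sprint.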
2. Daily Standups (Every Day - 15 minutes maximum)
Brief synchronous meeting (can be Slack-based for distributed teams) where each person answers:
- What did you accomplish yesterday? "Finished cleaning the draft data and identified 3 data quality issues we need to address"
- What will you work on today? "Fix the data quality issues, then start retraining the model"
- Any blockers or issues? "Waiting on coach feedback about the opponent report format"
The standup is not for problem-solving (that happens offline) but for coordination and obstacle identification. If discussions run long, table them for smaller follow-up conversations.
Async Alternative: For small teams or distributed teams across time zones, replace synchronous standups with daily Slack updates in a dedicated channel.
3. Sprint Review (Friday, Week 2 - 1 hour)
Demonstrate completed work to stakeholders and gather feedback:
- Demo deliverables: Show completed analyses, tools, visualizations to relevant stakeholders
- Discuss findings: Explain insights, implications, and recommendations
- Gather feedback: Does this answer the right question? Is it actionable? What would make it more useful?
- Update backlog: Based on feedback, add new tasks, adjust priorities, or refine future work
Key Principle: Only demo work that's actually done (meets your "definition of done"). Showing half-finished work undermines credibility.
4. Sprint Retrospective (Friday, Week 2, after Review - 30-45 minutes)
Team-only meeting reflecting on the sprint process:
- What went well? "We finished all committed work and the draft model presentation went smoothly"
- What could improve? "We got too many urgent requests mid-sprint that disrupted planned work"
- Action items for next sprint: "Block first hour each day for deep work with no meetings; create intake form for requests"
Critical Success Factor: Retrospectives only work if the team feels safe being honest. Leaders must not get defensive about criticism and must demonstrate willingness to change based on feedback.
Kanban Boards for Workflow Visualization
Kanban boards provide visual transparency into work status, helping teams coordinate and stakeholders understand current capacity. The board shows all work items and their current state, making bottlenecks and capacity constraints visible.
Basic Kanban Structure:
Columns (representing workflow stages):
- Backlog: Ideas and requests not yet prioritized or scoped
- To Do: Prioritized work ready to start (well-defined, unblocked)
- In Progress: Work currently being actively worked on
- Review: Completed work awaiting feedback, code review, or validation
- Done: Finished and delivered work
Cards (representing individual work items):
Each card includes:
- Descriptive title
- Assigned owner
- Priority level (High/Medium/Low)
- Brief description
- Links to related documents or code
Workflow:
Cards flow left-to-right as work progresses. Team members pull new work from "To Do" when they have capacity, rather than having work pushed onto them.
The following visualization shows a typical Kanban board state during an active sprint:
```{r}
#| label: kanban-example-r
#| message: false
#| warning: false

library(tidyverse)

# Example Kanban board state during an active sprint
# This represents realistic work distribution across workflow stages
kanban_data <- tribble(
  ~task, ~status, ~assignee, ~priority,
  "Draft Model Updates", "In Progress", "Sarah", "High",
  "Weekly Report Automation", "In Progress", "Mike", "Medium",
  "4th Down Tool Testing", "Review", "Sarah", "High",
  "FA Model Documentation", "To Do", "Alex", "Medium",
  "Salary Cap Dashboard", "To Do", "Mike", "Low",
  "Red Zone Analysis", "Done", "Sarah", "Medium",
  "Injury Risk Model", "Backlog", "Unassigned", "Low",
  "Play Type Prediction", "Backlog", "Unassigned", "Medium"
) %>%
  # Ensure status ordering matches workflow progression (applied to the
  # plotting data itself so the bars appear in workflow order)
  mutate(status = factor(status, levels = c("Backlog", "To Do", "In Progress", "Review", "Done")))

# Count tasks by status and priority
# This summary helps quickly assess workflow balance
status_summary <- kanban_data %>%
  count(status, priority)

# Visualize current sprint work distribution
ggplot(kanban_data, aes(x = status, fill = priority)) +
  geom_bar() +
  # Use traffic light colors: red (high), orange (medium), green (low)
  scale_fill_manual(
    values = c("High" = "#d32f2f", "Medium" = "#f57c00", "Low" = "#388e3c")
  ) +
  labs(
    title = "Analytics Team Kanban Board",
    subtitle = "Current sprint work status",
    x = "Status",
    y = "Number of Tasks",
    fill = "Priority"
  ) +
  theme_minimal() +
  theme(
    plot.title = element_text(face = "bold", size = 14),
    axis.text.x = element_text(angle = 0, hjust = 0.5)
  )
```
```{python}
#| label: kanban-example-py
#| message: false
#| warning: false

import pandas as pd
import matplotlib.pyplot as plt

# Example Kanban board showing typical mid-sprint state
kanban_df = pd.DataFrame({
    'task': [
        'Draft Model Updates', 'Weekly Report Automation',
        '4th Down Tool Testing', 'FA Model Documentation',
        'Salary Cap Dashboard', 'Red Zone Analysis',
        'Injury Risk Model', 'Play Type Prediction'
    ],
    'status': [
        'In Progress', 'In Progress', 'Review', 'To Do',
        'To Do', 'Done', 'Backlog', 'Backlog'
    ],
    'assignee': [
        'Sarah', 'Mike', 'Sarah', 'Alex',
        'Mike', 'Sarah', 'Unassigned', 'Unassigned'
    ],
    'priority': [
        'High', 'Medium', 'High', 'Medium',
        'Low', 'Medium', 'Low', 'Medium'
    ]
})

# Define workflow stage ordering (left-to-right progression)
status_order = ['Backlog', 'To Do', 'In Progress', 'Review', 'Done']
kanban_df['status'] = pd.Categorical(kanban_df['status'], categories=status_order, ordered=True)

# Aggregate by status and priority for stacked bar visualization
status_counts = kanban_df.groupby(['status', 'priority']).size().unstack(fill_value=0)

# Create stacked bar chart showing work distribution
fig, ax = plt.subplots(figsize=(10, 6))
status_counts.plot(
    kind='bar',
    stacked=True,
    # Traffic light colors for priority
    color={'High': '#d32f2f', 'Medium': '#f57c00', 'Low': '#388e3c'},
    ax=ax,
    alpha=0.8
)
ax.set_xlabel('Status', fontsize=12)
ax.set_ylabel('Number of Tasks', fontsize=12)
ax.set_title('Analytics Team Kanban Board\nCurrent Sprint Work Status',
             fontsize=14, fontweight='bold')
ax.set_xticklabels(ax.get_xticklabels(), rotation=0)
ax.legend(title='Priority', loc='upper left')
ax.grid(axis='y', alpha=0.3)
plt.tight_layout()
plt.show()

# Print summary statistics for quick status check
print("\nCurrent Sprint Summary:")
print(kanban_df.groupby('status').size())
print("\nWork by Team Member:")
print(kanban_df[kanban_df['assignee'] != 'Unassigned'].groupby('assignee').size())
```
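Digital boards enforce work-in-progress limits automatically, but the underlying check (covered in the best practices that follow) reduces to counting cards per column against a cap. A minimal sketch, where the board contents and column limits are assumed values for illustration:

```python
# WIP-limit check over a board snapshot; limits and contents are
# assumed values for illustration, not a specific tool's API.
from collections import Counter

board = [
    ("Draft Model Updates", "In Progress"),
    ("Weekly Report Automation", "In Progress"),
    ("4th Down Tool Testing", "Review"),
    ("FA Model Documentation", "To Do"),
    ("Salary Cap Dashboard", "To Do"),
]

# e.g., "In Progress" capped at team size, "Review" at reviewer capacity
WIP_LIMITS = {"In Progress": 3, "Review": 2}

counts = Counter(status for _, status in board)
for column, limit in WIP_LIMITS.items():
    n = counts[column]
    flag = "OVER LIMIT - finish before starting new work" if n > limit else "ok"
    print(f"{column}: {n}/{limit} {flag}")
```

Running this check daily (or wiring it into your board tool) makes the "pull, don't push" rule self-enforcing.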
Kanban Best Practices for Analytics Teams
**Limit Work-in-Progress (WIP)**:

- Set a maximum number of items allowed in "In Progress" (typically equal to team size)
- This forces finishing current work before starting new work
- Reduces context-switching and improves focus
- Makes blockers more visible and painful, encouraging resolution

**Make Everything Visible**:

- Include ALL work on the board, even small tasks and operational work
- This prevents the "invisible workload" problem where stakeholders don't understand why the team can't take on new projects
- Shows the full scope of commitments, not just major projects

**Update Regularly**:

- Team members should update the board daily (moving cards, adding comments)
- Out-of-date boards lose utility quickly
- Consider displaying the board on a TV/monitor in the team area for constant visibility

**Use Digital Tools**:

- Physical boards work for co-located teams, but digital boards (Jira, Trello, Asana) enable remote collaboration
- Digital boards can automatically track metrics like cycle time and throughput
- Choose tools that integrate with your workflow; don't force your workflow to fit the tools

Adapting Agile for Football Season
Football's annual cycle—with intense periods during the season and more strategic planning time in the offseason—requires adapting agile practices to match organizational rhythms.
In-Season Adjustments (September - February):
During the season, urgency peaks and timelines compress. Weekly game preparation dominates, leaving limited capacity for long-term projects.
Recommended Adjustments:
- Shorter sprints (1 week instead of 2): Align with weekly game cycle. Sprint goals might be "Support Week 7 opponent preparation" or "Update playoff probability models"
- Flexible priorities: Accept that game-week requests will disrupt plans. Reserve 40-50% of capacity for urgent requests
- Protected time for routine work: Batch similar requests (e.g., all weekly reports generated Friday afternoon) to create predictability
- Defer research: Minimize experimental projects during season unless they have immediate applicability
- More frequent check-ins: Daily standups become critical for coordinating rapidly shifting priorities
Sample In-Season Sprint Goals:
- Support Week X opponent preparation (reports, matchup analysis, tendency breakdowns)
- Update playoff probability models weekly
- Provide in-game decision support (4th down, 2-point conversion recommendations)
- Respond to ad-hoc coaching staff requests within 24-48 hours
- Maintain one strategic project progressing incrementally (e.g., draft model prep)
Off-Season Adjustments (March - August):
The offseason provides breathing room for strategic work, infrastructure improvements, and research that's impossible during the season.
Recommended Adjustments:
- Longer sprints (3-4 weeks): Allows sustained focus on complex projects
- Focus on infrastructure: Data pipelines, tool development, process documentation
- Research time: Experimental methodologies, exploring new data sources, academic collaborations
- Professional development: Courses, conferences, skill-building, reading groups
- Planning for season: Develop tools and analyses that will support in-season needs
- Deliberate stakeholder engagement: Regular check-ins with coaches and scouts to build relationships when there's time for conversation
Sample Off-Season Sprint Goals:
- Complete draft model rebuild with new features
- Develop automated weekly report generation system
- Integrate new tracking data source
- Build interactive 4th down decision tool
- Present research findings at conference
- Document all analytical processes and methodologies
Balance Urgent and Important Work
The season's urgent requests can completely consume analytics teams if not carefully managed. Protect capacity for important non-urgent work even during the season:

- **Reserve 20% of capacity** for strategic projects that progress incrementally
- **Say no to low-value urgent requests** (not everything urgent is important)
- **Automate routine seasonal work** so it doesn't require manual effort each week
- **Pre-build seasonal tools** in the offseason so season work is execution, not development

Teams that spend 100% of season capacity on urgent requests never build the infrastructure and capabilities that reduce future urgency.

Summary
Running an effective football analytics department requires balancing technical excellence with organizational leadership. Success depends on:
- Team Structure: Appropriate size and specialization for organizational needs, budget, and strategy
- Talent Management: Hiring skilled analysts who combine technical competence with football knowledge and communication abilities, then developing their capabilities
- Project Prioritization: Strategic allocation of limited resources toward highest-impact work
- Agile Methods: Flexible workflows adapted to football's seasonal rhythms
- Quality Standards: Professional code, documentation, and version control practices
- Collaboration: Building trust with coaches, scouts, and executives through consistent delivery and effective communication
- Infrastructure: Robust technical systems supporting both routine reporting and experimental research
- Culture: Continuous learning, celebrating wins, and managing failure constructively
- Career Development: Clear paths for technical and leadership growth
Analytics leaders must navigate unique challenges: compressed timelines where decisions are needed in days not weeks, high stakes where mistakes are public and consequential, cultural resistance from those who question analytics' value, and resource constraints with smaller teams than corporate analytics departments. Success requires technical expertise (statistical rigor, programming proficiency), football knowledge (understanding the game, its strategy, and its culture), communication skills (translating technical work for non-technical audiences), and organizational savvy (navigating politics, building relationships, demonstrating value).
The most effective analytics departments don't just produce good analysis—they integrate analytics into decision-making processes, build collaborative relationships across the organization, and create cultures where data-driven thinking becomes second nature. This integration requires intentional effort: developing tools decision-makers actually use, presenting insights in formats that resonate with different audiences, building trust through consistent delivery and intellectual honesty, and demonstrating respect for traditional football expertise while offering analytical perspectives.
Key Takeaways
1. **Right size matters**: Match team structure to organizational needs and resources. Better to start small and grow based on demonstrated value than build large teams speculatively.
2. **Hire for fit**: Balance technical skills, football knowledge, and communication abilities. The best analyst who can't influence decisions provides limited value.
3. **Prioritize ruthlessly**: Focus on high-impact projects tied to actual decisions. Saying no to low-value work is a core leadership responsibility.
4. **Maintain quality**: Professional standards for code, documentation, and processes ensure institutional knowledge survives personnel changes.
5. **Build relationships**: Success requires trust and collaboration across the organization. Invest time in informal relationship-building, not just formal presentations.
6. **Invest in infrastructure**: Robust technical systems enable efficient analytics. Time spent automating routine work pays dividends throughout the season.
7. **Foster learning**: Continuous improvement through education, knowledge sharing, and deliberate practice keeps teams at the frontier.
8. **Develop people**: Create career paths and growth opportunities. Retention is cheaper than replacement, and experienced analysts are more valuable.
9. **Adapt to seasons**: Flexible workflows accommodate in-season urgency while protecting offseason capacity for strategic projects.
10. **Lead with empathy**: Understand stakeholder needs, communicate in their language, and respect traditional football expertise while offering analytical perspectives.

Exercises
Conceptual Questions
- **Team Design**: You're starting an analytics department with a budget for 4 people. Design your team structure, defining roles and responsibilities for each position. Justify your choices based on organizational priorities (player evaluation vs. game strategy), available skills in the hiring market, and likely stakeholder needs. What trade-offs are you making, and how might you adjust the structure after year one based on different scenarios?
- **Cultural Resistance**: A veteran coach dismisses your 4th down recommendation, saying "analytics doesn't understand football—you can't reduce this game to numbers." How do you respond in the moment? What longer-term strategies do you employ to build trust with this coach? How do you balance confidence in your analysis with humility about its limitations?
- **Resource Allocation**: You have bandwidth for 2 major projects this offseason. The GM wants a comprehensive draft model, the head coach wants weekly opponent reports redesigned, and you believe a roster construction optimization tool would provide significant value. How do you prioritize? What factors influence your decision? How do you communicate your reasoning to stakeholders whose projects you're deferring?
- **Quality vs. Speed**: During the season, you're asked to deliver an analysis by tomorrow morning for a decision meeting. You can do a quick but less rigorous analysis in the time available, or take 2 additional days for a thorough investigation with proper validation. How do you decide? How do you communicate trade-offs to the stakeholder? What factors would push you toward speed vs. quality?
- **Career Development**: An analyst has been with your team for 3 years and is performing well technically but shows little interest in people management or organizational politics. They want to grow but on a technical track. How do you create growth opportunities? What does a "principal analyst" or "staff analyst" role look like? How do you retain this valuable contributor without forcing them into management?
- **Organizational Change**: A regime change brings a new GM and coaching staff who are skeptical of analytics. The previous leadership strongly supported your department. How do you approach this transition? What strategies do you employ to demonstrate value to the new leadership? How do you protect your team while being open to legitimate criticism?
Practical Exercises
Exercise 1: Build a Project Prioritization System
Create a prioritization framework customized for your organization:

**Tasks**:

a) Define 5-7 criteria for evaluating projects (you can modify the 4 in the chapter or create entirely new ones based on organizational context)
b) Assign weights to each criterion based on organizational priorities (interview stakeholders or make informed assumptions)
c) Create a detailed scoring rubric for each criterion on a 1-10 scale with explicit definitions
d) Build a function/tool that calculates priority scores (R function, Python script, or Excel model)
e) Apply it to 10-15 example projects from your organization or realistic hypothetical projects
f) Visualize results showing top priorities, explaining rationale, and showing sensitivity to weight changes
g) Write a 1-page memo explaining how to use this framework to stakeholders

**Advanced Extensions**:

- Add dynamic weighting that changes based on season phase (in-season vs. offseason)
- Include capacity constraints (e.g., can't commit to more than X person-weeks of work per sprint)
- Develop a Shiny or Streamlit app allowing stakeholders to interactively adjust weights and see how priorities change

Exercise 2: Code Quality Audit
Audit the quality of an existing analytics project (your own or a public GitHub repository):

**Assessment Areas**:

a) **Code style and readability**: Consistent naming conventions, appropriate commenting, logical organization, functions vs. scripts
b) **Version control usage**: Meaningful commit messages, appropriate branch structure, frequency of commits
c) **Documentation completeness**: README explaining what/why/how, function documentation, inline comments for complex logic
d) **Reproducibility**: Can someone else run this code? Are dependencies documented? Is there sample data?
e) **Security and best practices**: Any hardcoded credentials? Sensitive data in the repo? Appropriate .gitignore?
f) **Testing**: Any unit tests? Validation checks? Error handling?

**Deliverable**: Write a 2-3 page code review memo with:

- Executive summary (overall assessment)
- Scores for each area (1-10 scale with justification)
- Specific strengths to celebrate
- Specific improvement recommendations prioritized by importance
- Example before/after code snippets showing improvements

**Advanced Extension**: Develop a code quality rubric and checklist your team can use for all projects going forward.

Exercise 3: Design a Collaboration Tool
Build an interactive tool for coaches or scouts:

**Project Phases**:

a) **Identify decision problem**: Choose a specific recurring decision (4th downs, 2-point conversions, play-calling in specific situations, draft day trade value, etc.)
b) **User research**: Interview potential users about their current decision process, what information they use, and what they wish they knew
c) **Design interface**: Sketch wireframes showing inputs users provide and outputs they receive
d) **Implement backend**: Build the model or calculation engine (can be simplified for a prototype)
e) **Create visualizations**: Display recommendations with supporting evidence visually
f) **Add context and explanation**: Help users understand why the recommendation makes sense
g) **User testing**: Have non-technical users (family members, friends, volunteers) try the tool and gather feedback
h) **Iterate**: Improve based on feedback

**Technology Options**:

- R Shiny (good for statisticians, integrates with tidyverse)
- Python Streamlit (increasingly popular, clean syntax)
- Tableau dashboard (good for organizations already using Tableau)
- Web app (HTML/CSS/JavaScript with a backend API if you have those skills)

**Advanced Extensions**:

- Add ability to save scenarios and compare alternatives
- Export functionality (PDF reports, Excel spreadsheets)
- Historical tracking showing how recommendations performed
- Confidence intervals and uncertainty quantification

Exercise 4: Build an Automated Report Pipeline
Create an automated reporting system for a recurring analytics need:

**Components**:

a) **Report design**: Create a template for a weekly opponent report, draft prospect summary, or similar recurring output
b) **Data extraction**: Build a pipeline to pull required data from sources (nflfastR, databases, APIs)
c) **Data transformation**: Calculate statistics, aggregate appropriately, handle missing data
d) **Visualization generation**: Create charts and tables programmatically (ggplot, matplotlib, etc.)
e) **Report compilation**: Combine components into formatted output (RMarkdown to PDF/HTML, Jupyter to HTML, etc.)
f) **Automation**: Schedule automatic execution (cron job, GitHub Actions, Airflow, Windows Task Scheduler)
g) **Error handling**: Detect problems (missing data, failed calculations) and send alerts
h) **Delivery**: Email the report, post to Slack, save to a shared drive, or use another delivery mechanism

**Quality Criteria**:

- Runs without manual intervention
- Handles common errors gracefully
- Produces professional-looking output
- Includes data validation checks
- Logs execution for debugging
- Documents what it does and how to maintain it

**Advanced Extensions**:

- Conditional content based on data (e.g., automatically flag unusual patterns)
- Parameterized reports (generate for different teams, weeks, etc. with one script)
- Version control for outputs (track how reports changed over time)
- A/B testing different report formats with stakeholders

Exercise 5: Career Development Framework
Create a comprehensive career development framework for analytics professionals:

**Deliverables**:

a) **Career ladder definition**: Define 5-7 levels from entry-level through director, with:
   - Level titles
   - Typical years of experience
   - Reporting relationships
   - Approximate compensation ranges
b) **Competency expectations**: For each level, specify expectations across:
   - Technical skills (programming, statistics, ML, data engineering)
   - Domain knowledge (football understanding, analytics frameworks)
   - Communication (presenting, writing, visualization)
   - Leadership (mentorship, project management, strategy)
   - Organizational impact (scope of influence, decision involvement)
c) **Skill assessment rubrics**: Create detailed rubrics for evaluating each competency area (1-5 scale with behavioral descriptions)
d) **Promotion criteria**: Define what it takes to move from level N to level N+1, including:
   - Required competencies at the target level
   - Demonstration period (how long must they perform at the next level?)
   - Review process
e) **Individual development plan template**: Create a template employees and managers use for development conversations, including:
   - Current level and competency assessment
   - Target level and timeline
   - Development goals (specific, measurable)
   - Learning activities and resources
   - Check-in schedule
f) **Training resources guide**: Compile resources for developing each competency:
   - Online courses (Coursera, DataCamp, etc.)
   - Books and papers
   - Internal training opportunities
   - Conferences and events
   - Mentorship opportunities
g) **Alternative track design**: Design parallel tracks for those who don't want people management:
   - Individual contributor (IC) track for deep technical specialists
   - Equivalence between management and IC tracks at each level
   - How IC roles contribute without managing people

**Advanced Extension**: Conduct market research on analytics salaries across NFL teams, college programs, and related industries to benchmark compensation ranges realistically.

Exercise 6: Stakeholder Communication
Practice communicating analytics to different audiences:

**Project**:

a) Select a complex analytical finding (e.g., an ML model predicting draft success, fourth down optimization, player aging curves, etc.)
b) Create 4 different communication artifacts for different stakeholders:

   **Version 1 - Executive Summary (for GM)**:
   - 1-page maximum
   - Lead with the bottom-line recommendation
   - High-level methodology in 1-2 sentences
   - Key supporting evidence
   - Next steps

   **Version 2 - Technical Deep-Dive (for analytics peers)**:
   - 5-10 pages with detailed methodology
   - Statistical approach and validation
   - Code snippets or pseudocode
   - Limitations and assumptions
   - Areas for future work

   **Version 3 - Visual Presentation (for coaches)**:
   - 5-7 slide deck
   - Heavy on visuals, light on text
   - Football terminology, not statistical jargon
   - Actionable insights, not methodology
   - Anticipate questions

   **Version 4 - Implementation Guide (for scouts)**:
   - 2-3 pages with specific instructions
   - How to use this in the evaluation process
   - What to look for
   - Examples and case studies
   - FAQ section

c) **Feedback collection**: Present each version to someone representative of that audience (or role-play) and gather feedback:
   - Was it clear?
   - Was it the right amount of detail?
   - What would have made it more useful?
   - What confused them?
d) **Reflection**: Write 1-2 pages reflecting on:
   - What varied most across versions?
   - What was hardest about adapting your message?
   - What did you learn about audience needs?
   - How will this influence future communication?

**Goal**: Practice the critical skill of adapting technical work for diverse audiences with different needs, priorities, and technical backgrounds.

Further Reading
Books
Management and Leadership:
- "High Output Management" by Andy Grove - Management fundamentals applicable to analytics leadership
- "The Manager's Path" by Camille Fournier - Engineering leadership guidance that translates well to analytics
- "Radical Candor" by Kim Scott - Framework for giving feedback and having difficult conversations
- "Team of Teams" by General Stanley McChrystal - Organizational agility and coordination at scale
- "The Five Dysfunctions of a Team" by Patrick Lencioni - Building high-trust, high-performing teams
- "An Elegant Puzzle" by Will Larson - Systems thinking approach to engineering management
Decision-Making and Analytics:
- "Thinking in Bets" by Annie Duke - Decision-making under uncertainty, highly relevant to sports
- "The Signal and the Noise" by Nate Silver - Forecasting, prediction, and distinguishing signal from noise
- "Superforecasting" by Philip Tetlock - How to make better predictions through process and practice
- "How to Measure Anything" by Douglas Hubbard - Quantifying seemingly unquantifiable things
Technical Resources:
- "R for Data Science" by Wickham & Grolemund - Comprehensive R programming and workflows
- "Python for Data Analysis" by McKinney - Python data science fundamentals
- "The Pragmatic Programmer" by Hunt & Thomas - Software craftsmanship principles
- "Clean Code" by Robert Martin - Code quality and readability principles
- "Building Machine Learning Powered Applications" by Ameisen - Putting ML into production
Sports Analytics Specific:
- "The Numbers Game" by Anderson & Sally - Soccer analytics with lessons for football
- "Basketball on Paper" by Oliver - Pioneering basketball analytics work
- "Mathletics" by Wayne Winston - Sports analytics across multiple sports
Articles and Online Resources
Conferences:
- MIT Sloan Sports Analytics Conference - Premier sports analytics conference, proceedings available online
- Carnegie Mellon Sports Analytics Conference - Strong academic focus, papers published
- Great Lakes Analytics Conference - Midwest-focused sports analytics conference
- NFL Analytics Summit - Invite-only but materials sometimes shared
Blogs and Websites:
- Open Source Football (opensourcefootball.com) - Community blog with methodological discussions
- Ben Baldwin's Substack - Analytical deep-dives from a co-creator of nflfastR
- Football Outsiders - Long-running analytics site that originated DVOA (the site has since shut down; DVOA continues at FTN)
- The Athletic - Subscription site with analytics-focused NFL coverage
Social Media:
- Twitter/X Football Analytics Community - Follow @nflfastR, @benbbaldwin, @MikeClayNFL, @KieranKnight, @PFF_Moo for methodology discussions
- r/NFLAnalytics on Reddit - Community discussions of analytical approaches
Industry Resources:
- Team analytics blogs - Teams such as the Eagles, Ravens, and Browns have publicly discussed their analytical approaches
- NFL NextGen Stats - Official NFL tracking data and analytics content
- PFF (Pro Football Focus) - Commercial analytics company publishing research
Academic and Technical Resources
Statistics and Methodology:
- "Statistical Rethinking" by Richard McElreath - Bayesian statistics with practical focus
- "An Introduction to Statistical Learning" by James et al. - Machine learning fundamentals
- "Regression and Other Stories" by Gelman, Hill, & Vehtari - Practical regression modeling
Organizational and Strategic:
- "Competing on Analytics" by Davenport & Harris - Building analytical competitive advantage
- "Measure What Matters" by John Doerr - OKRs and goal-setting frameworks
- Harvard Business Review - Regular articles on analytics leadership and data science management
:::