Teaching Mobile Game Logic That Actually Works

We've spent eight years building games that people play for months, not minutes. Now we're showing others how matchmaking algorithms, progression curves, and economy balancing actually function when millions of players interact with them daily.

  • 14 weeks duration
  • 92% completion rate
  • 18 real projects

Numbers From Our Last Cohort

Between September and December 2024, we ran our most recent program. Out of 22 enrolled students, 20 finished all assignments and presented their final algorithm implementations.

The 14-week format gives people time to absorb complex systems while maintaining day jobs. Most participants spent 12-15 hours weekly on coursework and projects. We cover 18 different algorithm types across matchmaking, difficulty adjustment, monetization balancing, and player retention prediction.

Students work with actual anonymized datasets from games we've shipped — nothing synthetic or theoretical. You'll see messy player behavior, edge cases we didn't anticipate, and problems that required creative solutions when standard approaches failed.

How the Program Actually Unfolds

  • Weeks 1-4: Foundation Systems

    You start with player segmentation algorithms and basic matchmaking logic. We use C# and Python interchangeably depending on what makes each concept clearest. The first assignment involves building a skill-rating system from scratch — no libraries, just math and logic (a minimal sketch of that kind of system follows this timeline).

  • Weeks 5-9: Complex Balancing

    Here's where it gets interesting. You'll work on economy simulation — how in-game currency flows affect player behavior over weeks and months. We examine real failures from games that miscalculated their reward schedules and had to emergency-patch their systems. A toy currency-flow simulation after this timeline shows how quickly small rate changes compound.

  • Weeks 10-12: Predictive Models

    Machine learning enters the picture. Not fancy deep learning — practical classification and regression models that predict churn risk, monetization probability, and content engagement. You'll train models on our historical data and compare results against what actually happened. A bare-bones churn classifier in this style is sketched after the timeline.

  • Weeks 13-14: Integration Project

    Your final assignment combines everything. Design and implement a complete algorithmic system for a hypothetical game scenario we provide. Previous students have built dynamic difficulty systems, social graph analyzers, and seasonal event schedulers.
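
To make the foundation-systems assignment concrete, here is a minimal sketch of a from-scratch Elo-style rating update. The K-factor and 1500 starting rating are illustrative assumptions, not values the course prescribes.

```python
# Minimal Elo-style rating system, no libraries: the expected-score
# formula plus an additive update. K and the 1500 starting rating are
# illustrative defaults, not values the course prescribes.

K = 32           # update step: higher K converges faster but is noisier
START = 1500.0   # conventional starting rating

def expected_score(rating_a: float, rating_b: float) -> float:
    """Probability that player A beats player B under the Elo model."""
    return 1.0 / (1.0 + 10 ** ((rating_b - rating_a) / 400.0))

def update(rating_a: float, rating_b: float, a_won: bool) -> tuple[float, float]:
    """Return the new (rating_a, rating_b) after one match."""
    score_a = 1.0 if a_won else 0.0
    delta = K * (score_a - expected_score(rating_a, rating_b))
    return rating_a + delta, rating_b - delta

# An upset win by the lower-rated player moves both ratings sharply.
a, b = START, START + 200
a, b = update(a, b, a_won=True)
print(round(a), round(b))  # 1524 1676
```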
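
For the complex-balancing weeks, here is a toy sink-and-source simulation of soft-currency flow. Every rate below is invented for illustration; the point is how sensitive the equilibrium wallet size is to small sink changes.

```python
# Toy soft-currency economy: players earn from a daily "faucet" and spend
# into "sinks". If faucets outpace sinks, wallets inflate and late-game
# rewards lose meaning. All rates here are invented for illustration.

DAILY_EARN = 120      # currency granted per active day (faucet)
SINK_RATE = 0.15      # fraction of the current wallet spent per day (sink)
DAYS = 90

wallet = 0.0
for day in range(1, DAYS + 1):
    wallet += DAILY_EARN
    wallet -= wallet * SINK_RATE
    if day % 30 == 0:
        print(f"day {day:3d}: wallet = {wallet:,.0f}")

# The wallet converges toward DAILY_EARN * (1 - SINK_RATE) / SINK_RATE = 680.
# Halving SINK_RATE roughly doubles that equilibrium: small reward-schedule
# changes compound into large balance shifts over weeks.
```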
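
The predictive-models weeks work at roughly this level of machinery: a plain logistic-regression churn classifier. The feature names and toy rows here are hypothetical stand-ins for the anonymized datasets used in class.

```python
# Sketch of a week-10 style churn model: plain logistic regression over
# behavioral features. The feature names and toy data are hypothetical;
# the course uses anonymized datasets from shipped games instead.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Columns: sessions_last_7d, avg_session_minutes, days_since_purchase
X = np.array([
    [14, 22.0,  3],
    [ 9, 18.5,  6],
    [ 2,  6.0, 30],
    [ 1,  4.5, 45],
    [11, 25.0,  2],
    [ 0,  0.0, 60],
])
y = np.array([0, 0, 1, 1, 0, 1])  # 1 = churned within the next 14 days

model = LogisticRegression().fit(X, y)

new_player = np.array([[3, 8.0, 21]])
risk = model.predict_proba(new_player)[0, 1]
print(f"churn risk: {risk:.0%}")
```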

What You'll Actually Be Able to Do

We don't promise job placements or salary increases. What we can tell you is what skills you'll have by week 14, based on what every previous cohort has demonstrated.

Build Production-Ready Systems

You'll know how to implement matchmaking that handles 50,000 concurrent players without creating 10-minute wait times. You'll understand the tradeoffs between fairness, speed, and computational cost; a simplified sketch of one such tradeoff follows the list below.

  • Elo and TrueSkill rating calculations with real-time updates
  • Queue management algorithms that balance multiple constraints
  • Server load distribution for geographically dispersed players
  • Handling edge cases like smurfing and rank manipulation
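
As a rough illustration of the fairness-versus-wait-time tradeoff, here is a simplified single-queue matchmaker that widens each ticket's acceptable rating window the longer it waits. The window constants are arbitrary assumptions; a production system would add latency, region, party, and anti-smurf constraints on top.

```python
# Simplified matchmaking queue: each tick, a waiting player's acceptable
# rating window widens, trading match quality for wait time. The widening
# rate here is an arbitrary assumption for illustration.
from dataclasses import dataclass

@dataclass
class Ticket:
    player_id: str
    rating: float
    waited_ticks: int = 0

WINDOW_BASE = 50.0    # initial +/- rating tolerance
WINDOW_GROWTH = 25.0  # extra tolerance per tick spent waiting

def window(t: Ticket) -> float:
    return WINDOW_BASE + WINDOW_GROWTH * t.waited_ticks

def tick(queue: list[Ticket]) -> list[tuple[str, str]]:
    """Pair adjacent compatible tickets; unmatched tickets wait another tick."""
    matches, remaining = [], []
    pool = sorted(queue, key=lambda t: t.rating)
    i = 0
    while i < len(pool):
        if i + 1 < len(pool):
            a, b = pool[i], pool[i + 1]
            if b.rating - a.rating <= min(window(a), window(b)):
                matches.append((a.player_id, b.player_id))
                i += 2
                continue
        pool[i].waited_ticks += 1
        remaining.append(pool[i])
        i += 1
    queue[:] = remaining
    return matches

q = [Ticket("p1", 1500), Ticket("p2", 1540), Ticket("p3", 1800)]
print(tick(q))  # [('p1', 'p2')]; p3 waits and widens its window next tick
```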

Diagnose Failed Systems

More valuable than building from scratch is fixing what's broken. You'll learn to analyze player data, identify where algorithms are misbehaving, and propose targeted fixes rather than wholesale rewrites.

We spend considerable time on debugging methodologies specific to game systems where traditional testing approaches don't work well.
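
One example of the kind of data-first diagnostic we mean, sketched here with invented numbers: bucket matches by rating and check how often the higher-rated player wins. A bucket that drifts far from the Elo expectation tells you where to look before touching any code.

```python
# Diagnostic sketch (toy data): group matches into rating buckets and
# measure the higher-rated player's win rate per bucket. Large drift in
# one bucket localizes where the matchmaker is misbehaving.
from collections import defaultdict

# (higher_rating, lower_rating, higher_won) per match - invented numbers
matches = [
    (1520, 1490, True), (1610, 1580, False), (1575, 1540, True),
    (2210, 1950, True), (2180, 1900, True), (2250, 1980, True),
]

buckets: dict[int, list[bool]] = defaultdict(list)
for hi, lo, hi_won in matches:
    buckets[(hi // 500) * 500].append(hi_won)

for floor_rating, results in sorted(buckets.items()):
    rate = sum(results) / len(results)
    print(f"{floor_rating}-{floor_rating + 499}: "
          f"higher seed wins {rate:.0%} over {len(results)} matches")

# A 100% higher-seed win rate in the 2000+ bucket suggests the queue is
# pairing across too-wide rating gaps at the top: a classic low-population
# symptom worth a targeted fix rather than a wholesale rewrite.
```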

Communicate Technical Decisions

Every algorithm involves tradeoffs. Our program emphasizes documentation and justification — explaining why you chose approach A over approach B in terms business stakeholders understand.

You'll practice presenting technical implementations to non-technical audiences, since poor communication is often the reason a good algorithm gets rejected.

"I came in thinking I understood matchmaking because I'd read some blog posts and implemented a simple version once. Turns out I knew maybe 20% of what actually matters when you scale to real player populations. The assignments were frustrating in the best way — my first attempts failed for reasons I hadn't considered, which forced me to think more carefully about edge cases and system interactions."

Daren Wicklow

Systems Designer, Completed Autumn 2024 Session