Milestone System#

Recreate ML History with YOUR Code

Run the algorithms that changed the world using the TinyTorch you built from scratch

Purpose: The milestone system lets you run famous ML algorithms (1957-2018) using YOUR implementations. Every milestone validates that your code can recreate a historical breakthrough.

See Historical Milestones for the full historical context and significance of each milestone.

What Are Milestones?#

Milestones are runnable recreations of historical ML papers that use YOUR TinyTorch implementations:

  • 1957 - Rosenblatt’s Perceptron: The first trainable neural network

  • 1969 - XOR Solution: Solving the problem that stalled AI

  • 1986 - Backpropagation: The MLP revival (Rumelhart, Hinton & Williams)

  • 1998 - LeNet: Yann LeCun’s CNN breakthrough

  • 2017 - Transformer: “Attention is All You Need” (Vaswani et al.)

  • 2018 - MLPerf: Production ML benchmarks

Each milestone script imports YOUR code from the TinyTorch package you built.
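
For illustration, the top of a milestone script looks roughly like this (only the Tensor import appears verbatim elsewhere in this guide; the commented line is a hypothetical placeholder for whatever your later modules export):

# Hypothetical sketch of a milestone script's imports -- every symbol
# resolves to code YOU wrote and exported into the tinytorch package.
from tinytorch import Tensor                 # Module 01: your tensor class
# from tinytorch import ...                  # later modules: activations, layers, optimizers

x = Tensor([[1.0, 2.0], [3.0, 4.0]])         # data flows through YOUR implementation
print(x)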

Quick Start#

Typical workflow:

# 1. Build the required modules (e.g., Foundation Tier for Milestone 03)
tito module complete 01  # Tensor
tito module complete 02  # Activations
tito module complete 03  # Layers
tito module complete 04  # Losses
tito module complete 05  # Autograd
tito module complete 06  # Optimizers
tito module complete 07  # Training

# 2. See what milestones you can run
tito milestone list

# 3. Get details about a specific milestone
tito milestone info 03

# 4. Run it!
tito milestone run 03

Essential Commands#

Discover Milestones#

List All Milestones

tito milestone list

Shows all 6 historical milestones with status:

  • 🔒 LOCKED - Need to complete required modules first

  • 🎯 READY TO RUN - All prerequisites met!

  • COMPLETE - You’ve already achieved this

Simple View (compact list):

tito milestone list --simple

Learn About Milestones#

Get Detailed Information

tito milestone info 03

Shows:

  • Historical context (year, researchers, significance)

  • Description of what you’ll recreate

  • Required modules with ✓/✗ status

  • Whether you’re ready to run it

Run Milestones#

Run a Milestone

tito milestone run 03

What happens:

  1. Checks prerequisites - Validates required modules are complete

  2. Tests imports - Ensures YOUR implementations work

  3. Shows context - Historical background and what you’ll recreate

  4. Runs the script - Executes the milestone using YOUR code

  5. Tracks achievement - Records your completion

  6. Celebrates! - Shows achievement message 🏆

Skip prerequisite checks (not recommended):

tito milestone run 03 --skip-checks

Track Progress#

View Milestone Progress

tito milestone status

Shows:

  • How many milestones you’ve completed

  • Your overall progress (%)

  • Unlocked capabilities

  • Next milestone ready to run

Visual Timeline

tito milestone timeline

See your journey through ML history in a visual tree format.

The 6 Milestones#

Milestone 01: Perceptron (1957) 🧠#

What: Frank Rosenblatt’s first trainable neural network

Requires: Module 01 (Tensor)

What you’ll do: Implement and train the perceptron that proved machines could learn (the learning rule is sketched below)

Historical significance: First demonstration of machine learning

Run it:

tito milestone info 01
tito milestone run 01
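
The milestone script does this training with YOUR Tensor class; as a plain-NumPy reference (an illustrative sketch, not the milestone’s actual code), Rosenblatt’s learning rule fits in a few lines:

import numpy as np

# Rosenblatt's perceptron learning rule on a linearly separable problem (AND).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 0, 0, 1])                       # AND labels

w, b, lr = np.zeros(2), 0.0, 1.0
for _ in range(10):                              # a handful of epochs is enough
    for xi, target in zip(X, y):
        pred = 1 if xi @ w + b > 0 else 0        # step activation
        w += lr * (target - pred) * xi           # nudge weights toward the target
        b += lr * (target - pred)

print([1 if xi @ w + b > 0 else 0 for xi in X])  # -> [0, 0, 0, 1]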

Milestone 02: XOR Crisis (1969) 🔀#

What: Solving the problem that stalled AI research

Requires: Modules 01-02 (Tensor, Activations)

What you’ll do: Use a multi-layer network to solve XOR, a problem no single-layer perceptron can solve (see the sketch below)

Historical significance: Minsky & Papert showed the perceptron’s limitations; this milestone shows how to overcome them

Run it:

tito milestone info 02
tito milestone run 02
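
As background for why the extra layer matters, here is a hand-wired two-layer network that computes XOR in plain NumPy (illustrative only; the milestone learns such weights using YOUR Tensor and activation functions):

import numpy as np

def step(z):
    return (z > 0).astype(float)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
W1 = np.array([[1.0, 1.0], [1.0, 1.0]])   # hidden unit 1 acts like OR, unit 2 like AND
b1 = np.array([-0.5, -1.5])
W2 = np.array([[1.0], [-1.0]])            # output = OR AND (NOT AND) = XOR
b2 = np.array([-0.5])

h = step(X @ W1 + b1)                     # hidden layer gives [OR, AND] per input
out = step(h @ W2 + b2)                   # output layer combines them into XOR
print(out.ravel())                        # -> [0. 1. 1. 0.]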

Milestone 03: MLP Revival (1986)#

What: Backpropagation breakthrough - train deep networks on MNIST

Requires: Modules 01-07 (Complete Foundation Tier)

What you’ll do: Train a multi-layer perceptron to recognize handwritten digits (95%+ accuracy); a hand-derived backprop reference is sketched below

Historical significance: Rumelhart, Hinton & Williams (Nature, 1986) - the paper that reignited neural network research

Run it:

tito milestone info 03
tito milestone run 03
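
In the milestone, YOUR autograd computes every gradient on MNIST automatically. As a pocket reference for what backpropagation actually does, here is one hand-derived gradient step through a tiny two-layer network in NumPy (an illustrative sketch, not the milestone’s code):

import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 3))                  # toy batch: 4 samples, 3 features
y = rng.normal(size=(4, 1))                  # toy regression targets

W1, b1 = rng.normal(size=(3, 5)) * 0.1, np.zeros(5)
W2, b2 = rng.normal(size=(5, 1)) * 0.1, np.zeros(1)

# Forward pass
h = np.tanh(X @ W1 + b1)
pred = h @ W2 + b2
loss = np.mean((pred - y) ** 2)

# Backward pass: apply the chain rule layer by layer
d_pred = 2 * (pred - y) / len(X)             # dLoss/dpred
dW2, db2 = h.T @ d_pred, d_pred.sum(axis=0)
d_h = d_pred @ W2.T                          # push the gradient into the hidden layer
d_pre = d_h * (1 - h ** 2)                   # tanh'(z) = 1 - tanh(z)^2
dW1, db1 = X.T @ d_pre, d_pre.sum(axis=0)

# One SGD step; repeating this loop is what trains the MNIST MLP
for p, g in [(W1, dW1), (b1, db1), (W2, dW2), (b2, db2)]:
    p -= 0.1 * g
print(f"loss before the step: {loss:.4f}")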

Milestone 04: CNN Revolution (1998) 👁#

What: LeNet - Computer Vision Breakthrough

Requires: Modules 01-09 (Foundation + Spatial/Convolutions)

What you’ll do: Build LeNet for digit recognition using convolutional layers (the core sliding-window operation is sketched below)

Historical significance: Yann LeCun’s breakthrough that enabled modern computer vision

Run it:

tito milestone info 04
tito milestone run 04
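
The operation that makes LeNet work is small enough to sketch: slide a kernel across the image and take a dot product at every position. Here is a naive NumPy reference; YOUR spatial/convolution module implements the efficient, batched version:

import numpy as np

def conv2d(image, kernel):
    # Naive 2D "convolution" (cross-correlation, as used in CNNs).
    kh, kw = kernel.shape
    out_h, out_w = image.shape[0] - kh + 1, image.shape[1] - kw + 1
    out = np.zeros((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

image = np.arange(25, dtype=float).reshape(5, 5)  # toy 5x5 "image"
edge_kernel = np.array([[1.0, -1.0]])             # responds to horizontal changes
print(conv2d(image, edge_kernel).shape)           # -> (5, 4)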

Milestone 05: Transformer Era (2017) 🤖#

What: “Attention is All You Need”

Requires: Modules 01-13 (Foundation + Architecture Tiers)

What you’ll do: Implement the transformer architecture with its self-attention mechanism (sketched below)

Historical significance: Vaswani et al. revolutionized NLP and enabled GPT/BERT/modern LLMs

Run it:

tito milestone info 05
tito milestone run 05
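
The mechanism this milestone is built around is scaled dot-product attention. Here is a NumPy reference of the computation (illustrative only; the milestone expresses it with YOUR Tensor operations):

import numpy as np

def attention(Q, K, V):
    # softmax(Q K^T / sqrt(d_k)) V -- the core of "Attention Is All You Need"
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                # similarity of every query to every key
    scores -= scores.max(axis=-1, keepdims=True)   # numerical stability for softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V                             # weighted mix of the values

rng = np.random.default_rng(0)
seq = rng.normal(size=(4, 8))                      # 4 tokens, 8-dimensional embeddings
print(attention(seq, seq, seq).shape)              # self-attention -> (4, 8)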

Milestone 06: MLPerf Benchmarks (2018) 🏆#

What: Production ML Systems

Requires: Modules 01-19 (Foundation + Architecture + Optimization Tiers)

What you’ll do: Optimize for production deployment with quantization, compression, and benchmarking (a quantization sketch follows below)

Historical significance: MLPerf standardized ML system benchmarks for real-world deployment

Run it:

tito milestone info 06
tito milestone run 06
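
One of the optimizations this milestone exercises is weight quantization. A symmetric int8 sketch in NumPy shows the idea (illustrative; the milestone uses YOUR optimization modules and its own exact recipe):

import numpy as np

weights = np.random.default_rng(0).normal(size=1000).astype(np.float32)

scale = np.abs(weights).max() / 127.0              # map the largest magnitude to +/-127
q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
dequant = q.astype(np.float32) * scale             # reconstruct for inference

error = np.abs(weights - dequant).max()
print(f"{weights.nbytes} bytes -> {q.nbytes} bytes, max error {error:.4f}")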

Prerequisites and Validation#

How Prerequisites Work#

Each milestone requires specific modules to be complete. The run command automatically validates:

Module Completion Check:

tito milestone run 03

🔍 Checking prerequisites for Milestone 03...
   Module 01 - complete
   Module 02 - complete
   Module 03 - complete
   Module 04 - complete
   Module 05 - complete
   Module 06 - complete
   Module 07 - complete

All prerequisites met!

Import Validation:

🧪 Testing YOUR implementations...
   Tensor import successful
   Activations import successful
   Layers import successful

✅ YOUR TinyTorch is ready!

If Prerequisites Are Missing#

You’ll see a helpful error:

 Missing Required Modules

Milestone 03 requires modules: 01, 02, 03, 04, 05, 06, 07
Missing: 05, 06, 07

Complete the missing modules first:
  tito module start 05
  tito module start 06
  tito module start 07

Achievement Celebration#

When you successfully complete a milestone, you’ll see:

╔════════════════════════════════════════════════╗
║  🎓 Milestone 03: MLP Revival (1986)           ║
║  Backpropagation Breakthrough                  ║
╚════════════════════════════════════════════════╝

🏆 MILESTONE ACHIEVED!

You completed Milestone 03: MLP Revival (1986)
Backpropagation Breakthrough

What makes this special:
• Every line of code: YOUR implementations
• Every tensor operation: YOUR Tensor class
• Every gradient: YOUR autograd

Achievement saved to your progress!

🎯 What's Next:
Milestone 04: CNN Revolution (1998)
Unlock by completing modules: 08, 09

Understanding Your Progress#

Three Tracking Systems#

TinyTorch tracks progress in three ways, all related but distinct (a snippet for inspecting the raw files follows the list):

1. Module Completion (tito module status)

  • Which modules (01-20) you’ve implemented

  • Tracked in .tito/progress.json

  • Required for running milestones

2. Milestone Achievements (tito milestone status)

  • Which historical papers you’ve recreated

  • Tracked in .tito/milestones.json

  • Unlocked by completing modules + running milestones

3. Capability Checkpoints (tito checkpoint status) - OPTIONAL

  • Gamified capability tracking

  • Tracked in .tito/checkpoints.json

  • Purely motivational; can be disabled
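
If you want to peek at the raw data, the snippet below reads the three files (this assumes they are plain JSON, as their names suggest; the exact schema is an internal detail of tito):

import json
from pathlib import Path

for name in ["progress.json", "milestones.json", "checkpoints.json"]:
    path = Path(".tito") / name                    # tracking files live in .tito/
    if path.exists():
        print(name, "->", json.loads(path.read_text()))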

Relationship Between Systems#

Complete Modules (01-07)
    ↓
Unlock Milestone 03
    ↓
Run: tito milestone run 03
    ↓
Achievement Recorded
    ↓
Capability Unlocked (optional checkpoint system)

Tips for Success#

1. Complete Modules in Order#

While you can technically skip around, the tier structure is designed for progressive learning:

  • Foundation Tier (01-07): Required for the first milestone

  • Architecture Tier (08-13): Builds on the Foundation Tier

  • Optimization Tier (14-19): Builds on the Architecture Tier

2. Test as You Go#

Before running a milestone, make sure your modules work:

# After completing a module
tito module complete 05

# Test it works
python -c "from tinytorch import Tensor; print(Tensor([[1,2]]))"

3. Use Info Before Run#

Learn what you’re about to do:

tito milestone info 03  # Read the context first
tito milestone run 03   # Then run it

4. Celebrate Achievements#

Share your milestones! Each one represents recreating a breakthrough that shaped modern AI.

Troubleshooting#

“Import Error” when running milestone#

Problem: Module not exported or import failing

Solution:

# Re-export the module
tito module complete XX

# Test import manually
python -c "from tinytorch import Tensor"

“Prerequisites Not Met” but I completed modules#

Problem: Progress not tracked correctly

Solution:

# Check module status
tito module status

# If modules show incomplete, re-run complete
tito module complete XX

Milestone script fails during execution#

Problem: Bug in your implementation

Solution:

  1. Check error message for which module failed

  2. Edit modules/source/XX_name/ (NOT tinytorch/)

  3. Re-export: tito module complete XX

  4. Run milestone again

Next Steps#

Ready to Recreate ML History?

Start with the Foundation Tier and work toward your first milestone

Foundation Tier → Historical Context →

Every milestone uses YOUR code. Every achievement is proof you understand ML systems deeply. Build from scratch, recreate history, master the fundamentals.