#| fig-cap: "**TinyTorch Module Flow.** The 20 modules progress through three tiers: Foundation (blue) builds core ML primitives, Architecture (purple) applies them to vision and language tasks, and Optimization (orange) makes systems production-ready."
graph LR
subgraph F["FOUNDATION (01-08)"]
direction TB
T["Tensor"] --> A["Activations"]
A --> L["Layers"]
L --> Loss["Losses"]
Loss --> Auto["Autograd"]
Auto --> Opt["Optimizers"]
Opt --> Train["Training"]
end
subgraph Arch["ARCHITECTURE (09-13)"]
direction TB
Data["DataLoader"]
Data --> Conv["CNNs"]
Data --> Tok["Tokenization"]
Tok --> Emb["Embed"]
Emb --> Att["Attention"]
Att --> Trans["Transform"]
end
subgraph Optim["OPTIMIZATION (14-19)"]
direction TB
Prof["Profiling"]
Prof --> Q["Quant"]
Prof --> C["Compress"]
Prof --> M["Memo"]
Prof --> Ac["Accel"]
Q --> Bench["Benchmark"]
C --> Bench
M --> Bench
Ac --> Bench
end
Train --> Data
Conv --> Prof
Trans --> Prof
Bench --> Cap["20: Capstone"]
%% Foundation - Light Blue (dark text readable)
style T fill:#bbdefb,stroke:#1976d2
style A fill:#bbdefb,stroke:#1976d2
style L fill:#bbdefb,stroke:#1976d2
style Loss fill:#bbdefb,stroke:#1976d2
style Auto fill:#bbdefb,stroke:#1976d2
style Opt fill:#bbdefb,stroke:#1976d2
style Train fill:#bbdefb,stroke:#1976d2
%% Architecture - Light Purple (dark text readable)
style Data fill:#e1bee7,stroke:#7b1fa2
style Conv fill:#e1bee7,stroke:#7b1fa2
style Tok fill:#e1bee7,stroke:#7b1fa2
style Emb fill:#e1bee7,stroke:#7b1fa2
style Att fill:#e1bee7,stroke:#7b1fa2
style Trans fill:#e1bee7,stroke:#7b1fa2
%% Optimization - Light Orange (dark text readable)
style Prof fill:#ffe0b2,stroke:#f57c00
style Q fill:#ffe0b2,stroke:#f57c00
style C fill:#ffe0b2,stroke:#f57c00
style M fill:#ffe0b2,stroke:#f57c00
style Ac fill:#ffe0b2,stroke:#f57c00
style Bench fill:#ffe0b2,stroke:#f57c00
%% Capstone - Light Gold (dark text readable)
style Cap fill:#fff9c4,stroke:#f9a825
%% Subgraph backgrounds
style F fill:#e3f2fd,stroke:#1976d2
style Arch fill:#f3e5f5,stroke:#7b1fa2
style Optim fill:#fff3e0,stroke:#f57c00
# Big Picture

*2-minute orientation before you begin building*

This page answers: *How do all the pieces fit together?* Read this before diving into the modules to build your mental map.
## The Journey: Foundation to Production
TinyTorch takes you from basic tensors to production-ready ML systems through 20 progressive modules. Here’s how they connect:
Three tiers, one complete system:
Foundation (blue): Build the core machinery. Tensors hold data, activations add non-linearity, layers combine them, losses measure error, autograd computes gradients, optimizers update weights, and training orchestrates the loop. Each piece answers the question "what do I need to learn next?"
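The way these pieces interlock can be sketched in a few lines of plain NumPy (a minimal illustration, not TinyTorch's actual API): a linear layer, a mean-squared-error loss, a hand-derived gradient standing in for autograd, and SGD as the optimizer. Activations are omitted here since the target is linear.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(64, 3))           # tensors hold the data
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w                         # labels from a known linear rule

w = np.zeros(3)                        # a linear layer's weights
lr = 0.1                               # optimizer hyperparameter

for step in range(200):                # training orchestrates the loop
    pred = X @ w                       # layer: forward pass
    err = pred - y
    loss = (err ** 2).mean()           # loss measures error
    grad = 2 * X.T @ err / len(y)      # autograd would compute this for us
    w -= lr * grad                     # optimizer updates the weights

print(np.round(w, 2))                  # w ≈ [1., -2., 0.5]
```

In TinyTorch, the hand-derived `grad` line is exactly what the autograd module automates.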
Architecture (purple): Apply your foundation to real problems. DataLoader feeds data efficiently, then you choose your path: Convolutions for images or Transformers for text (Tokenization → Embeddings → Attention → Transformers).
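The text path can be sketched the same way. The snippet below is plain NumPy with a made-up three-word vocabulary and identity projections in place of learned query/key/value matrices; it shows tokenization, an embedding lookup, and one pass of softmax self-attention:

```python
import numpy as np

vocab = {"the": 0, "cat": 1, "sat": 2}
tokens = [vocab[w] for w in "the cat sat".split()]  # tokenization → ids

d = 4
rng = np.random.default_rng(0)
table = rng.normal(size=(len(vocab), d))
x = table[tokens]                                   # embeddings: (3, d)

# Single-head self-attention, with Wq = Wk = Wv = I for brevity.
scores = x @ x.T / np.sqrt(d)                       # scaled dot-product scores
weights = np.exp(scores)
weights /= weights.sum(axis=1, keepdims=True)       # softmax over each row
out = weights @ x                                   # context-mixed embeddings

print(out.shape)                                    # (3, 4)
```

Stacking this attention step with feed-forward layers from the Foundation tier is, in essence, the Transformers module.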
Optimization (orange): Make it fast. Profile to find bottlenecks, then apply quantization, compression, acceleration, or memoization. Benchmarking measures your improvements.
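As one concrete example of these techniques, the sketch below shows generic 8-bit post-training quantization of a weight tensor (an illustration, not TinyTorch's quantization module): store int8 values plus a single scale, then verify the round-trip error stays within half a quantization step.

```python
import numpy as np

rng = np.random.default_rng(0)
w = rng.normal(size=1000).astype(np.float32)      # pretend layer weights

scale = np.abs(w).max() / 127.0                   # map [-max, max] → [-127, 127]
q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
w_hat = q.astype(np.float32) * scale              # dequantize to compare

print(q.nbytes, "vs", w.nbytes, "bytes")          # 1000 vs 4000 bytes: 4x smaller
print(np.abs(w - w_hat).max() <= scale / 2 + 1e-6)  # error ≤ half a step
```

Profiling tells you which tensors to quantize; benchmarking then confirms the size/speed win did not cost meaningful accuracy.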