Module 4

Statistics & Probability

The mathematical language of uncertainty. Distributions, hypothesis testing, Bayesian inference. Every machine learning model speaks this language — this module teaches you to hear it.

01

Why statistics is not optional

Linear algebra gives AI its structure. Calculus gives it its learning algorithm. Statistics gives it its epistemology — its theory of knowledge under uncertainty. A neural network's output is a probability distribution. A model's confidence is a statistical statement. Without statistics, you can run AI. With statistics, you can understand what it's telling you.
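The claim that a model's output is a probability distribution can be made concrete with the softmax function, which turns raw classifier scores into probabilities that sum to one. A minimal sketch (the logit values are invented for illustration):

```python
import math

def softmax(logits):
    """Convert raw scores into a probability distribution (sums to 1)."""
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical scores for three classes from some classifier.
probs = softmax([2.0, 1.0, 0.1])
# Each entry is the model's stated probability for that class,
# and together they sum to 1 -- a statistical statement, not just a score.
```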

02

Distributions as worldviews

A probability distribution is a complete description of uncertainty about a quantity. The normal distribution assumes symmetric randomness. The Poisson assumes rare events. The Bernoulli assumes binary outcomes. Choosing the right distribution is choosing the right model of reality. Machine learning automates this choice — but understanding the options is understanding the assumptions your model is making.
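The three worldviews above can be sampled directly. A sketch using only the standard library (the Poisson sampler uses Knuth's classic multiplication method; parameter values are illustrative):

```python
import math
import random

random.seed(0)

# Normal: symmetric randomness around a mean (e.g. measurement noise).
normal_sample = random.gauss(0.0, 1.0)

# Bernoulli: a single binary outcome with success probability p.
def bernoulli(p):
    return 1 if random.random() < p else 0

# Poisson: count of rare events in a fixed interval (Knuth's method).
def poisson(lam):
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= random.random()
        if p <= threshold:
            return k
        k += 1
```

Drawing many samples from each and plotting the histograms makes the differing assumptions visible: symmetric spread, binary mass, and a skewed count of rare events.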

03

Bayesian inference

P(A|B) = P(B|A) · P(A) / P(B) — Bayes' theorem

Bayes' theorem is the engine of probabilistic reasoning. You start with a prior belief. You observe evidence. You update your belief. This is precisely what machine learning does: start with random weights (prior), observe training data (evidence), update the weights (posterior). Bayesian thinking is not a metaphor for machine learning — it is the mathematical framework that is machine learning.
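The prior-evidence-posterior loop can be run by hand for the simplest case: estimating a coin's bias. A sketch using the conjugate Beta-Bernoulli update, where the closed-form rule is simply "add each heads to a, each tails to b" (the flip sequence is invented):

```python
# Prior belief about the coin's bias is Beta(a, b);
# each observed flip (1 = heads, 0 = tails) updates it in closed form.
def update(a, b, flips):
    for flip in flips:
        if flip == 1:
            a += 1
        else:
            b += 1
    return a, b

a, b = 1, 1  # uniform prior: total ignorance about the bias
a, b = update(a, b, [1, 1, 0, 1, 1, 1, 0, 1])  # evidence: 6 heads, 2 tails
posterior_mean = a / (a + b)  # Beta(7, 3) -> posterior mean 0.7
```

Each flip shifts the belief a little; a long run of evidence eventually swamps any reasonable prior, which is the same qualitative behaviour as training data reshaping a model's initial weights.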

Status

This module is planned.

Statistics & Probability follows the Linear Algebra module. Interactive visualisations for distributions, hypothesis testing, and Bayesian updating are being designed.

Continue the sequence
Previous Module

← Linear Algebra as Transformation

Matrices as transformations. Eigenvectors. The dot product as attention mechanism.

Next Module

Music of Pattern →

Number in time. Fourier, sequence models, attention as counterpoint. Quadrivium III.