Probability & Statistics for Machine Learning
Summary
Foundational probability concepts essential for ML: Bayes' theorem with fully worked examples, Gaussian distributions and their parameters, covariance and Pearson correlation from scratch, and joint/conditional/marginal probabilities. Hands-on work includes applying Bayes' rule to practical scenarios, plotting Gaussian PDFs, analysing the Iris dataset with 2D histograms and probability matrices, and building intuition for why probability is the language every ML algorithm speaks.
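As a minimal sketch of the joint/conditional/marginal relationship, the snippet below works through a small joint probability table in NumPy; the weather/activity variables and their numbers are invented for illustration, not taken from the lesson:

```python
import numpy as np

# Hypothetical joint probability table P(weather, activity):
# rows = weather (sunny, rainy), columns = activity (walk, read).
joint = np.array([[0.35, 0.15],
                  [0.05, 0.45]])

# Marginals: sum out the other variable.
p_weather = joint.sum(axis=1)   # P(weather)  -> [0.5, 0.5]
p_activity = joint.sum(axis=0)  # P(activity) -> [0.4, 0.6]

# Conditionals: P(activity | weather) = P(weather, activity) / P(weather).
p_activity_given_weather = joint / p_weather[:, None]

print(p_activity_given_weather)  # each row sums to 1
```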
Materials
Probability Theory — The Language of Uncertainty
Bayes' theorem, Gaussian distributions, covariance, Pearson correlation, and why every ML model is secretly a probability machine.
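A minimal sketch of Bayes' theorem in action, assuming a standard diagnostic-test scenario; the prevalence, sensitivity, and false-positive numbers are hypothetical, chosen only to make the arithmetic concrete:

```python
# Bayes' theorem: P(disease | positive)
#   = P(positive | disease) * P(disease) / P(positive)
p_disease = 0.01              # prior: hypothetical 1% prevalence
p_pos_given_disease = 0.95    # sensitivity (assumed)
p_pos_given_healthy = 0.05    # false-positive rate (assumed)

# Denominator via the law of total probability.
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

# Posterior: probability of disease given a positive test.
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(f"P(disease | positive) = {p_disease_given_pos:.3f}")  # ~0.161
```

The punchline of this kind of worked example is that even a 95%-accurate test yields only a ~16% posterior when the prior is 1%, which is exactly the intuition the material builds.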
Probability in Practice — Bayes, Gaussians & the Iris Dataset
Implement Bayes' theorem from scratch, plot Gaussian PDFs, compute covariance and the Pearson correlation coefficient (PCC) both by hand and with NumPy, and explore feature distributions on the Iris dataset.
Includes notebook
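A minimal sketch of the Gaussian-plotting step, assuming Matplotlib is available; the (mu, sigma) pairs are arbitrary choices meant to show how the two parameters shift and stretch the curve:

```python
import numpy as np
import matplotlib.pyplot as plt

def gaussian_pdf(x, mu, sigma):
    """Gaussian PDF: exp(-(x - mu)^2 / (2 sigma^2)) / (sigma sqrt(2 pi))."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

x = np.linspace(-6, 6, 500)
for mu, sigma in [(0, 1), (0, 2), (2, 0.5)]:
    plt.plot(x, gaussian_pdf(x, mu, sigma), label=f"mu={mu}, sigma={sigma}")
plt.xlabel("x")
plt.ylabel("density")
plt.legend()
plt.show()
```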
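And a sketch of covariance and PCC computed from scratch and checked against NumPy, assuming the Iris data is loaded via scikit-learn's load_iris; the sepal-length/petal-length feature pair is an arbitrary choice for illustration:

```python
import numpy as np
from sklearn.datasets import load_iris

X = load_iris().data
x, y = X[:, 0], X[:, 2]  # sepal length vs petal length (assumed pair)

def covariance(a, b):
    """Sample covariance: mean product of deviations, n-1 denominator."""
    return np.sum((a - a.mean()) * (b - b.mean())) / (len(a) - 1)

def pearson(a, b):
    """Pearson correlation: covariance normalised by both std devs."""
    return covariance(a, b) / (a.std(ddof=1) * b.std(ddof=1))

print(covariance(x, y), np.cov(x, y, ddof=1)[0, 1])  # should match
print(pearson(x, y), np.corrcoef(x, y)[0, 1])        # should match
```

Using ddof=1 throughout keeps the hand-rolled versions consistent with NumPy's default sample (rather than population) covariance, which is the usual gotcha when the two disagree.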