3
Linear Regression
Summary
Linear regression from theory to a complete PyTorch implementation: the hypothesis function y = wᵀx + b, mean squared error loss, gradient descent optimisation, z-score feature normalisation, learning rate experiments, weight interpretation on the Diabetes dataset, and extending to 5th-order polynomial regression with L2 (Ridge) regularisation to control overfitting.
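The polynomial extension mentioned above can be sketched in a few lines: expand a scalar input into powers x, x², …, x⁵, then add an L2 (Ridge) penalty λ‖w‖² to the MSE loss. This is a minimal illustration under assumed settings, not the lesson's actual code; the target function, λ, and learning rate are illustrative choices.

```python
import torch

torch.manual_seed(0)
x = torch.linspace(-1, 1, 100)
y = 2 * x - x ** 3                        # illustrative target a degree-5 model can fit

# Polynomial feature expansion: x, x^2, ..., x^5
X = torch.stack([x ** k for k in range(1, 6)], dim=1)
X = (X - X.mean(dim=0)) / X.std(dim=0)    # z-score feature normalisation

w = torch.zeros(5, requires_grad=True)
b = torch.zeros(1, requires_grad=True)
lam, lr = 1e-3, 0.1                       # assumed regularisation strength and step size
for _ in range(5000):
    y_hat = X @ w + b
    loss = ((y_hat - y) ** 2).mean() + lam * (w ** 2).sum()  # MSE + L2 penalty
    loss.backward()
    with torch.no_grad():                 # manual gradient descent step
        w -= lr * w.grad
        b -= lr * b.grad
        w.grad.zero_()
        b.grad.zero_()
```

The L2 term shrinks the higher-order weights towards zero, which is what keeps the 5th-order model from overfitting noisy data.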
Materials
THEORY
Linear Regression — Predicting Numbers with Lines
The hypothesis function, MSE loss, gradient descent from first principles, feature normalisation, the bias-variance tradeoff, and L2 regularisation.
PRACTICE
Building Linear Regression from Scratch in PyTorch
Custom LinearRegression as nn.Module, manual MSE and gradient descent, z-score normalisation, the Diabetes dataset, learning rate experiments, and weight interpretation.
Includes notebook
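The ingredients listed in the practice blurb can be combined into one short sketch: a custom LinearRegression nn.Module, a hand-written MSE, and manual gradient-descent updates on z-score-normalised data. This is a hedged illustration, not the notebook's actual code; the synthetic data stands in for the Diabetes features, and all names and hyperparameters are assumptions.

```python
import torch
import torch.nn as nn

class LinearRegression(nn.Module):
    def __init__(self, n_features):
        super().__init__()
        self.w = nn.Parameter(torch.zeros(n_features))  # weights of y = w^T x + b
        self.b = nn.Parameter(torch.zeros(1))

    def forward(self, x):
        return x @ self.w + self.b

torch.manual_seed(0)
X = torch.randn(200, 3) * 4.0 + 10.0      # unnormalised synthetic features
y = X @ torch.tensor([1.5, -2.0, 0.5]) + 0.7

X = (X - X.mean(dim=0)) / X.std(dim=0)    # z-score: zero mean, unit variance per feature

model = LinearRegression(3)
lr = 0.1                                  # one setting for a learning rate experiment
for _ in range(500):
    loss = ((model(X) - y) ** 2).mean()   # MSE written out by hand
    loss.backward()
    with torch.no_grad():
        for p in model.parameters():      # one manual gradient-descent step
            p -= lr * p.grad
            p.grad.zero_()
```

After training, `model.w` holds one weight per (normalised) feature, which is what makes the weight-interpretation exercise possible: larger-magnitude weights indicate features with more influence on the prediction.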