Linear Algebra for Machine Learning
The language of AI. Vectors, matrices, transformations — everything your neural network is built from.
Chapter 1: Foundations: Vectors and Matrices
The building blocks. Every piece of data in AI lives as a vector or matrix.
Vectors and Vector Operations
Learn what vectors are, why they are the fundamental building blocks of machine learning, and how to manipulate them.
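The vector operations this lesson covers can be sketched in a few lines of NumPy (NumPy is an assumption here; the lesson itself may use a different library):

```python
import numpy as np

# Two 3-dimensional vectors, e.g. feature vectors for two samples.
u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])

print(u + v)              # elementwise addition -> [5. 7. 9.]
print(2.0 * u)            # scalar multiplication -> [2. 4. 6.]
print(np.dot(u, v))       # dot product: 1*4 + 2*5 + 3*6 = 32.0
print(np.linalg.norm(u))  # Euclidean length: sqrt(14), about 3.742
```

The dot product and the norm are the two workhorses: similarity scores, projections, and loss functions in ML are built from them.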
Matrices and Matrix Operations
Understand matrices as the core data structure of machine learning — from storing datasets to representing neural network layers.
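As a minimal sketch of the "matrix as dataset" idea (the numbers are made up for illustration): rows are samples, columns are features, and most preprocessing is just row or column operations.

```python
import numpy as np

# A tiny dataset as a matrix: each row is a sample, each column a feature.
data = np.array([[5.1, 3.5],
                 [4.9, 3.0],
                 [6.2, 3.4]])

print(data.shape)         # (3, 2): 3 samples, 2 features
print(data[0])            # the first sample (a row)
print(data[:, 1])         # the second feature across all samples (a column)
print(data.mean(axis=0))  # per-feature mean -> [5.4 3.3]
```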
Matrix Multiplication and Its Meaning
Go deep on matrix multiplication — the single most important operation in all of AI. Understand what it really does and why it matters.
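A hedged preview of the core idea: one matrix multiply can apply a whole neural-network layer to a whole batch at once, and every entry of the result is a dot product. The shapes and weights below are illustrative, not from any real model.

```python
import numpy as np

# A batch of 2 samples with 3 features each.
X = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])

# A weight matrix mapping 3 inputs to 2 outputs (one dense layer, no bias).
W = np.array([[0.1, 0.2],
              [0.3, 0.4],
              [0.5, 0.6]])

Y = X @ W  # (2, 3) @ (3, 2) -> (2, 2): every sample through the layer at once

# Each entry of Y is the dot product of one row of X with one column of W.
assert np.isclose(Y[0, 0], np.dot(X[0], W[:, 0]))
print(Y)  # -> [[2.2 2.8]
          #     [4.9 6.4]]
```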
Chapter 2: Linear Transformations
How matrices transform space — the geometric intuition behind neural networks.
Transpose and Inverse of Matrices
Two fundamental operations used everywhere in ML: the transpose, which swaps rows and columns, and the inverse, which undoes an invertible transformation.
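Both operations in a minimal NumPy sketch (the matrix is an arbitrary invertible example):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# Transpose: rows become columns and vice versa.
print(A.T)  # -> [[1. 3.]
            #     [2. 4.]]

# Inverse: A_inv @ A gives the identity (only square, nonsingular matrices).
A_inv = np.linalg.inv(A)
print(A_inv @ A)  # approximately the 2x2 identity
```

In practice, libraries solve linear systems directly (e.g. `np.linalg.solve`) rather than forming the inverse explicitly, which is slower and less numerically stable.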
Systems of Linear Equations
How solving Ax = b connects to finding optimal parameters.
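A tiny worked instance of Ax = b (the system itself is made up): finding the parameter vector x that satisfies all the equations at once is the same shape of problem as fitting model parameters.

```python
import numpy as np

# Solve the system  3x0 + x1 = 9,  x0 + 2x1 = 8,  i.e. A x = b.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([9.0, 8.0])

x = np.linalg.solve(A, b)
print(x)  # -> [2. 3.]

assert np.allclose(A @ x, b)  # the solution reproduces b exactly
```

When A has more rows than columns (more data points than parameters), there is usually no exact solution, and `np.linalg.lstsq` finds the least-squares best fit instead.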
Vector Spaces and Subspaces
Understanding the spaces your data lives in.
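One concrete way to see a subspace, sketched with a made-up example: if one column is a linear combination of the others, the columns span less than the full space, and the rank reveals it.

```python
import numpy as np

# Columns: v1 = (1,0,0), v2 = (0,1,0), v3 = v1 + v2.
# The third column adds no new direction, so the columns span only a plane.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 0.0]])

print(np.linalg.matrix_rank(A))  # 2: a 2-D subspace of R^3, not all of it
```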
Linear Transformations
What matrices actually do — transform space.
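A minimal sketch of "a matrix transforms space," using a rotation as the example transformation:

```python
import numpy as np

# A 90-degree counterclockwise rotation of the plane, written as a matrix.
theta = np.pi / 2
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

point = np.array([1.0, 0.0])
print(R @ point)  # (1, 0) rotates to approximately (0, 1)
```

Every matrix acts this way: multiplying by it rotates, scales, shears, or projects every vector in the space at once, which is exactly what each layer of a neural network does to its inputs.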
Chapter 3: Advanced: Eigen-everything and Decompositions
Eigenvalues, SVD, and PCA — the tools for understanding high-dimensional data.
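As a hedged preview of where the chapter is headed, here is PCA in miniature via an eigendecomposition of the covariance matrix. The synthetic data and seed are assumptions for illustration only:

```python
import numpy as np

rng = np.random.default_rng(0)

# Correlated 2-D data: the second coordinate mostly follows the first.
X = rng.normal(size=(200, 2))
X[:, 1] = 0.9 * X[:, 0] + 0.1 * X[:, 1]

# PCA: eigendecomposition of the (centered) covariance matrix.
Xc = X - X.mean(axis=0)
cov = Xc.T @ Xc / (len(Xc) - 1)
eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order

# The eigenvector with the largest eigenvalue points along the direction
# of maximum variance -- here, roughly the direction (1, 0.9).
top = eigvecs[:, -1]
print(eigvals)
print(top)
```

SVD computes the same directions more stably and without forming the covariance matrix, which is why `np.linalg.svd` is what production PCA implementations typically use.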