Modern machine‑learning engineers don’t study calculus for the pleasure of proofs; we study it because every gradient, every optimizer, and every diffusion model is built on it. This 16‑post series walks from single‑variable limits all the way to Newton‑like second‑order methods, sprinkling PyTorch, NumPy, matplotlib, and even torchdiffeq demos along the way. Each post is two things at once: a piece of calculus exposition and a hands‑on code lab.
Below is the full outline, grouped into four thematic parts.
Part I: Single‑Variable Calculus (Posts 1–5)

# | Topic | Code‑Lab Highlights | Why ML cares
---|---|---|---
1 | Limits & Continuity | zoom‑in plots, ε–δ checker | numerical stability, vanishing grads
2 | Derivatives | finite diff vs autograd on torch.sin (sketch below) | gradients drive learning
3 | Fundamental Theorem | trapezoid & Simpson vs autograd.grad | loss ↔ derivatives ↔ integrals |
4 | 1‑D Optimization | hand‑rolled gradient descent (sketch below) | baby training loop
5 | Taylor/Maclaurin | animated truncations | activation approx., positional encodings |
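To preview the post‑2 lab, here is a minimal sketch comparing a central finite difference against autograd on `torch.sin` (the grid, step size `h`, and tolerance are illustrative choices, not taken from the series):

```python
import torch

# Autograd: d/dx sin(x) = cos(x). float64 avoids cancellation error
# in the finite difference below.
x = torch.linspace(-3.0, 3.0, steps=7, dtype=torch.float64, requires_grad=True)
torch.sin(x).sum().backward()

# Central finite difference, accurate to O(h^2)
h = 1e-4
with torch.no_grad():
    fd = (torch.sin(x + h) - torch.sin(x - h)) / (2 * h)

print(torch.allclose(x.grad, fd, atol=1e-6))  # True: both approximate cos(x)
```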
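In the same spirit, the post‑4 "baby training loop" boils down to something like this hand‑rolled gradient descent on a toy quadratic (the loss and learning rate are illustrative):

```python
import torch

# Minimise f(x) = (x - 2)^2 by following the negative gradient.
x = torch.tensor(0.0, requires_grad=True)
lr = 0.1

for _ in range(100):
    f = (x - 2.0) ** 2
    f.backward()
    with torch.no_grad():
        x -= lr * x.grad   # the entire "optimizer"
    x.grad.zero_()

print(x.item())  # ~2.0, the minimiser
```

Swap the quadratic for a network’s loss and `x` for its weights, and you have the skeleton of every training loop.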
Part II: Multivariable Calculus (Posts 6–10)

# | Topic | Code‑Lab Highlights | Why ML cares
---|---|---|---
6 | Vectors & ∇ | quiver of ∇ | visual back‑prop intuition |
7 | Jacobian & Hessian | tiny‑MLP Hessian spectrum | curvature, second‑order opt.
8 | Multiple Integrals | Monte‑Carlo 2‑D Gaussian (sketch below) | expected loss, ELBO
9 | Change of Variables | affine flow, log‑det via autograd (sketch below) | flow‑based generative models
10 | Line & Surface Integrals | streamplots, path work | RL trajectories, gradient flow |
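As a taste of the post‑8 lab, a minimal Monte‑Carlo estimate of an expectation under a 2‑D Gaussian (the integrand x² + y² is an illustrative stand‑in, chosen because its exact expectation is 2):

```python
import numpy as np

rng = np.random.default_rng(0)
# 100k samples from a standard 2-D Gaussian
xy = rng.multivariate_normal(mean=[0.0, 0.0],
                             cov=[[1.0, 0.0], [0.0, 1.0]],
                             size=100_000)

# Monte-Carlo estimate of E[x^2 + y^2]; the exact value is 2
f = (xy ** 2).sum(axis=1)
print(f.mean(), "+/-", f.std() / np.sqrt(len(f)))
```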
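And for post 9, the heart of the change‑of‑variables formula in one dimension: recovering the log‑determinant of an affine flow with autograd (a 1‑D sketch; real flows stack many such layers in higher dimensions):

```python
import torch

# Affine flow z = a*x + b; change of variables needs log|dz/dx| = log|a|
a, b = torch.tensor(2.0), torch.tensor(0.5)
x = torch.tensor(1.3, requires_grad=True)

z = a * x + b
(dz_dx,) = torch.autograd.grad(z, x)

print(torch.log(torch.abs(dz_dx)))  # log 2 ~ 0.6931
```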
Part III: Vector Calculus & Differential Equations (Posts 11–13)

# | Topic | Code‑Lab Highlights | Why ML cares
---|---|---|---
11 | Divergence, Curl, Laplacian | heat‑equation on grid (sketch below) | diffusion models, graph Laplacian
12 | ODEs | train Neural‑ODE on spirals (sketch below) | continuous‑time nets
13 | PDEs | finite‑diff wave equation | physics‑informed nets, vision kernels |
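To preview the post‑11 lab, an explicit finite‑difference scheme for the heat equation ∂u/∂t = α∇²u on a grid (grid size, α, and step counts are illustrative; `np.roll` makes the boundaries periodic):

```python
import numpy as np

alpha, dx, dt = 0.1, 1.0, 1.0   # stable while alpha*dt/dx^2 <= 1/4
u = np.zeros((64, 64))
u[32, 32] = 100.0               # a hot spot in the middle

for _ in range(500):
    # 5-point Laplacian stencil via shifted copies of the grid
    lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
           np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4 * u) / dx**2
    u = u + alpha * dt * lap

print(u.max())                  # the spike has diffused outward
```

The same 5‑point Laplacian stencil reappears as the graph Laplacian and inside diffusion models.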
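And the smallest possible torchdiffeq call for post 12: integrating a linear vector field whose trajectories are spirals (the class name `SpiralField` and the matrix are illustrative; the actual lab learns the dynamics instead of hard‑coding them):

```python
import torch
from torchdiffeq import odeint

class SpiralField(torch.nn.Module):
    """dy/dt = A y, whose trajectories spiral in toward the origin."""
    def __init__(self):
        super().__init__()
        self.A = torch.tensor([[-0.1, 2.0], [-2.0, -0.1]])

    def forward(self, t, y):
        return y @ self.A.T

y0 = torch.tensor([2.0, 0.0])
t = torch.linspace(0.0, 10.0, 100)
traj = odeint(SpiralField(), y0, t)  # adaptive ODE solve
print(traj.shape)                    # (100, 2)
```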
Part IV: Variational & Computational Calculus (Posts 14–16)

# | Topic | Code‑Lab Highlights | Why ML cares
---|---|---|---
14 | Functional Derivatives | gradient of a functional | weight decay as variational prob.
15 | Back‑prop from Scratch | 50‑line reverse‑mode engine (sketch below) | demystify autograd
16 | Hessian‑Vector / Newton | SGD vs L‑BFGS, BFGS sketch (HVP sketch below) | faster second‑order ideas
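In the spirit of the post‑15 lab, a stripped‑down scalar reverse‑mode engine (a sketch supporting only `+` and `*`; the class name `Value` and all details are illustrative, not the series’ actual engine):

```python
class Value:
    """A scalar node in a computation graph, with reverse-mode autodiff."""
    def __init__(self, data, parents=()):
        self.data, self.grad = data, 0.0
        self._parents, self._backward = parents, lambda: None

    def __add__(self, other):
        out = Value(self.data + other.data, (self, other))
        def _backward():          # d(out)/d(self) = d(out)/d(other) = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        out = Value(self.data * other.data, (self, other))
        def _backward():          # product rule
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the graph, then sweep it in reverse.
        topo, seen = [], set()
        def build(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    build(p)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

x = Value(3.0)
y = x * x + x     # dy/dx = 2x + 1
y.backward()
print(x.grad)     # 7.0
```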
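And for post 16, the Hessian‑vector product trick via double backward, the building block behind Newton‑type methods (the quartic toy loss is an illustrative choice):

```python
import torch

def loss(w):
    return (w ** 4).sum()   # toy loss with non-constant curvature

w = torch.randn(3, requires_grad=True)
v = torch.randn(3)

# create_graph=True keeps the graph so we can differentiate again
g = torch.autograd.grad(loss(w), w, create_graph=True)[0]
# Differentiating g.v gives Hv without ever materialising H
hvp = torch.autograd.grad(g @ v, w)[0]

# For this loss H = diag(12 w^2), so we can check exactly
print(torch.allclose(hvp, 12 * w.detach() ** 2 * v))
```

Because `Hv` never forms the full Hessian, this trick scales to models with millions of parameters.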
This wraps the calculus arc, but not the journey: future posts will spin these tools into full‑scale ML projects.
If you’ve followed along, you now wield derivatives, integrals, flows, and variational principles — not as formalities, but as coding techniques that ship in real ML systems. Onward!