Math of diffusion

generative
fastai
talk
tutorial
Walk through the math of diffusion models from the ground up, with no prerequisites beyond high school math. This is Lesson 9B of fast.ai’s Practical Deep Learning for Coders Part 2, 2022.
Author

Wasim Lorgat

Published

October 23, 2022

If you want to understand the math of diffusion but feel intimidated by the jargon, check out Lesson 9B: Math of Diffusion of fast.ai’s Practical Deep Learning for Coders Part 2, 2022, from the wonderful Tanishq and me. You’ll learn about the key equations underpinning diffusion models, with no prerequisites beyond high school math.

What you’ll learn

We walk through the math of diffusion models from the ground up, explaining the insights underlying the key equations in the work of Sohl-Dickstein et al. (2015), which introduced diffusion models.

By the end of the lesson you’ll have some understanding of the following key concepts and you’ll know how to recognize and interpret their symbols in research papers: probability density function (pdf), data distribution, forward process, reverse process, Markov process, Gaussian distribution, log likelihood, and evidence lower bound (ELBO).
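To give a taste of one of these concepts, here's a minimal NumPy sketch of a forward (noising) process: starting from a data point, each step mixes in a little Gaussian noise. The variable names and the linear noise schedule are my own illustration, not taken from the lesson.

```python
import numpy as np

rng = np.random.default_rng(0)

# A toy "data point" drawn from the data distribution.
x0 = 1.0

# Noise schedule: beta_t controls how much noise is added at step t.
# A linear schedule is a common simple choice (illustrative values).
betas = np.linspace(1e-4, 0.02, 1000)

# Forward process: q(x_t | x_{t-1}) = N(sqrt(1 - beta_t) * x_{t-1}, beta_t).
# Because each step depends only on the previous one, this is a Markov process.
x = x0
for beta in betas:
    x = np.sqrt(1 - beta) * x + np.sqrt(beta) * rng.standard_normal()

# After many steps, x is approximately a sample from a standard Gaussian:
# the data has been gradually destroyed into pure noise.
print(x)
```

The reverse process, which the model learns, runs this in the other direction: starting from noise and gradually recovering data.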

We also touch on the more recent breakthroughs of Ho, Jain, and Abbeel (2020) which enabled even simpler and more powerful diffusion models.
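One convenience exploited in that line of work is that the Gaussian forward process can be sampled in closed form at any step t, without looping through all the intermediate steps: q(x_t | x_0) = N(sqrt(ᾱ_t) x_0, (1 − ᾱ_t) I), where ᾱ_t is the cumulative product of (1 − β_s). A hedged sketch (names and schedule are my own):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative linear noise schedule, as before.
betas = np.linspace(1e-4, 0.02, 1000)

# alpha_bar[t] = product over s <= t of (1 - beta_s).
alpha_bar = np.cumprod(1.0 - betas)

def q_sample(x0, t):
    """Sample x_t directly from q(x_t | x_0) in one shot."""
    noise = rng.standard_normal(np.shape(x0))
    return np.sqrt(alpha_bar[t]) * x0 + np.sqrt(1.0 - alpha_bar[t]) * noise

# Jump straight to step 500 of the forward process for a toy data point.
x0 = np.ones(4)
xt = q_sample(x0, 500)
print(xt)
```

This one-shot sampling is what makes training efficient: each training example can be noised to a random step t directly.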

You can discuss this lesson, and access links to all notebooks and resources from it, at this forum topic.

You don’t need a PhD

Here’s what Alex, a student of the course, had to say about the lesson:

@strickvl (Alex Strick van Linschoten) posts: Just here to say thank you to @ilovescience and @seem for the 9B lecture that dropped this morning. My first reaction on seeing something with the title "the math of diffusion" was to assume that 'oh, that's just something for all the smart people who have PhDs in mathematics on the course, and it'll probably be completely incomprehensible', but of course it's not that at all! I'm not all the way through, but so far I'm just really grateful how you both take things slowly and don't make any assumptions as to the background of your viewers. So thank you!

You definitely don’t need a PhD! In fact, the lesson came about because I felt the same way as Alex. I was frustrated at how difficult I found it to understand the math in diffusion papers.

Recorded at fast.ai HQ

Thanks to nudges from Jeremy, we went from an informal conversation, to a talk at the fast.ai unconference, to a recorded lesson – in a span of 4 days! Jeremy was kind enough to let us use his equipment and record at the fast.ai HQ.

Jeremy and Wasim behind a desk recording Lesson 9B, “Math of Diffusion”. Blurred background.

Check out the other lesson resources

I’m grateful to be part of this amazing group of people developing fast.ai’s From Deep Learning Foundations to Stable Diffusion. Follow the tweet below to find more lesson resources from the team: Johno Whitaker, Pedro Cuenca, Tanishq Abraham, and of course Jeremy Howard.

References

Ho, Jonathan, Ajay Jain, and Pieter Abbeel. 2020. “Denoising Diffusion Probabilistic Models.” arXiv. http://arxiv.org/abs/2006.11239.
Sohl-Dickstein, Jascha, Eric A. Weiss, Niru Maheswaranathan, and Surya Ganguli. 2015. “Deep Unsupervised Learning Using Nonequilibrium Thermodynamics.” arXiv. https://doi.org/10.48550/arXiv.1503.03585.