Oct. 2025
Speaker: Quentin Giton
Institution: 3L15
Time: 16:20 - 16:40
Location: 3L15
Generative models aim to learn, and sample from, the probability distribution underlying observed data. I'll give a quick tour of what's out there, then show how to build a generative model with optimal transport: we move any distribution towards the Gaussian by following the Wasserstein gradient flow of a free energy, and we implement this flow using the JKO scheme. In practice, the next measure in the scheme cannot be computed exactly, so we linearly parametrize the associated Kantorovich potential and learn its weights by SGD; the semi-discrete case drops out as a special, very interpretable instance. I'll present the construction and the current results (no full error bounds yet), and show some animations that reveal the model's generative behavior.
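For context, a minimal sketch of the JKO iteration referred to above, written under the assumption (not made explicit in the abstract) that the free energy is the relative entropy with respect to the standard Gaussian \(\gamma = \mathcal{N}(0, I_d)\):
\[
  F(\mu) = \mathrm{KL}(\mu \,\|\, \gamma),
  \qquad
  \mu_{k+1} \in \operatorname*{arg\,min}_{\mu \in \mathcal{P}_2(\mathbb{R}^d)}
  \; F(\mu) + \frac{1}{2\tau}\, W_2^2(\mu, \mu_k),
  \qquad \tau > 0 .
\]
As the step size \(\tau\) tends to zero, the interpolation of the iterates \((\mu_k)\) converges to the Wasserstein gradient flow of \(F\), which for this choice of \(F\) is a Fokker-Planck evolution whose stationary measure is \(\gamma\); this is what "moving any distribution towards the Gaussian" means here. The talk's actual free energy and parametrization may differ; this display is only meant to fix notation for the JKO step.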