Jan. 2026
| Speaker: | Aram-Alexandre Pooladian |
| Institution: | Yale University |
| Time: | 14:00–15:00 |
| Location: | 3L8 |
The task of generative modeling typically concerns the transport of a single source distribution to a single target distribution on the basis of samples. In this work, we study the task of learning flows between families of distributions (i.e., many source measures to many target measures). Examples include families of means and covariances (e.g., Gaussians) or families of point clouds. In both of these examples, the geometry of the data is relevant to the generative task, and we wish to preserve this geometry along the flows. We introduce Wasserstein flow matching (WFM), which appropriately lifts flow matching onto families of distributions by appealing to the Riemannian nature of the Wasserstein geometry. Our algorithm leverages theoretical and computational advances in entropic optimal transport, as well as the attention mechanism in our neural network architecture. As applications, we demonstrate how to generate representations of granular cell states from single-cell genomics data (via Bures–Wasserstein FM) and synthesize cellular microenvironments from spatial transcriptomics datasets. Code is available at this https URL. This is joint work with Doron Haviv, Brandon Amos, and Dana Pe'er.
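In the Gaussian case mentioned above, the Wasserstein geometry between distributions reduces to the Bures–Wasserstein geometry on means and covariances, where geodesics have a closed form: means interpolate linearly, and covariances follow the McCann interpolation induced by the optimal transport map. The sketch below (a standard construction, not code from the talk; the function name and example matrices are illustrative) computes this geodesic between two Gaussians:

```python
import numpy as np
from scipy.linalg import sqrtm


def bures_wasserstein_geodesic(m0, S0, m1, S1, t):
    """Point at time t on the Bures-Wasserstein geodesic
    from N(m0, S0) to N(m1, S1), for SPD covariances S0, S1."""
    # Means interpolate linearly along the geodesic.
    mt = (1 - t) * m0 + t * m1

    # Linear part of the optimal transport map between the Gaussians:
    # T = S0^{-1/2} (S0^{1/2} S1 S0^{1/2})^{1/2} S0^{-1/2}
    S0_half = np.real(sqrtm(S0))
    S0_half_inv = np.linalg.inv(S0_half)
    T = S0_half_inv @ np.real(sqrtm(S0_half @ S1 @ S0_half)) @ S0_half_inv

    # McCann interpolation of the covariance: S_t = A_t S0 A_t
    # with A_t = (1 - t) I + t T.
    A = (1 - t) * np.eye(len(m0)) + t * T
    St = A @ S0 @ A
    return mt, St


# Example: midpoint between two 2-D Gaussians.
m0, S0 = np.zeros(2), np.diag([2.0, 1.0])
m1, S1 = np.array([1.0, -1.0]), np.array([[1.0, 0.3], [0.3, 1.0]])
m_half, S_half = bures_wasserstein_geodesic(m0, S0, m1, S1, 0.5)
```

In a flow-matching setting, the time derivative of this interpolation would serve as the regression target for the learned velocity field; WFM generalizes this idea beyond the Gaussian family via entropic optimal transport.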