Datashape Seminar
Ergodic learning and particle gradient descent generative model for point processes
15 Nov. 2023
Speaker: Bartłomiej (Bartek) Błaszczyszyn
Institution: INRIA (Dyogene), ENS Paris
Time: 11:00 - 12:00
Location: 2L8

Streamed online via BBB.

Abstract:
Almost surely, any infinite realization of an ergodic point process, say in Euclidean space, completely characterizes the distribution of the point process and thus (in principle) allows one to sample new realizations from this distribution. In practice, we only observe a partial realization in a finite window. If this window (more precisely, the number of points in it) is large enough, can we construct an approximation of the unknown original distribution and use it to sample new realizations? Inspired by recent advances in gradient descent methods for maximum entropy models, in joint work (Brochard et al., 2022) we propose a method to generate similar point patterns by jointly moving the particles of an initial Poisson process realization towards a target counting measure. The overall quality of our model is evaluated on point processes with various geometric structures through spectral and topological data analysis, and compared in particular to the method of Tscheschel and Stoyan (2006).
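The basic idea of the abstract, moving the particles of an initial Poisson-like configuration by gradient descent until its summary statistics match those of the observed pattern, can be illustrated with a small sketch. The Python snippet below is only an illustration under simplified assumptions: it replaces the wavelet-based descriptors of Brochard et al. (2022) with a multi-scale Gaussian pair statistic, and all parameter values (scales, step size, number of iterations, synthetic target pattern) are arbitrary choices, not those of the paper.

# Minimal sketch (not the authors' implementation): particle gradient descent
# that moves an initial binomial (Poisson-like) point set so that a simple
# multi-scale Gaussian pair statistic matches that of an observed pattern.
# The paper uses richer wavelet-based descriptors; this statistic is only a
# stand-in so that the example stays self-contained.
import numpy as np

rng = np.random.default_rng(0)


def pair_statistic(x, scales):
    """Multi-scale pair statistic: mean Gaussian kernel value over all pairs."""
    d2 = np.sum((x[:, None, :] - x[None, :, :]) ** 2, axis=-1)  # squared distances
    np.fill_diagonal(d2, np.inf)                                # drop i == j terms
    return np.array([np.exp(-d2 / (2 * s**2)).sum() for s in scales]) / len(x) ** 2


def gradient(x, target, scales):
    """Analytic gradient of the squared descriptor mismatch w.r.t. particle positions."""
    n = len(x)
    diff = x[:, None, :] - x[None, :, :]                        # (n, n, dim) displacements
    d2 = np.sum(diff**2, axis=-1)
    np.fill_diagonal(d2, np.inf)
    phi = np.array([np.exp(-d2 / (2 * s**2)).sum() for s in scales]) / n**2
    grad = np.zeros_like(x)
    for s, p, t in zip(scales, phi, target):
        w = np.exp(-d2 / (2 * s**2)) / s**2                     # pair weights at this scale
        g_phi = -2.0 / n**2 * (w[:, :, None] * diff).sum(axis=1)
        grad += 2.0 * (p - t) * g_phi                           # chain rule for (phi - target)^2
    return grad


# "Observed" pattern in the unit square (here: a synthetic clustered pattern).
centers = rng.uniform(0, 1, size=(20, 2))
observed = (centers[rng.integers(0, 20, 200)] + 0.02 * rng.normal(size=(200, 2))) % 1.0

scales = np.array([0.01, 0.05, 0.1])                            # illustrative choice
target = pair_statistic(observed, scales)

# Initial configuration: uniform points, i.e. a Poisson process conditioned on N.
particles = rng.uniform(0, 1, size=(len(observed), 2))

# Plain gradient descent on the particle positions.
step = 0.2
for it in range(1000):
    particles -= step * gradient(particles, target, scales)
    particles %= 1.0                                            # toy periodic boundary

print("final descriptor mismatch:",
      np.abs(pair_statistic(particles, scales) - target))

In this toy setup the loss is the squared mismatch of the pair statistic; the actual method evaluates generated patterns with much richer spectral and topological summaries, as described in the abstract.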

References:
Tscheschel, A. and Stoyan, D. (2006). Statistical reconstruction of random point patterns. Computational Statistics & Data Analysis, 51(2), 859-871.
Brochard, A., Błaszczyszyn, B., Mallat, S. and Zhang, S. (2022). Particle gradient descent model for point process generation. Statistics and Computing, 32, 1-25.
