[Seminar postponed]

Thursday, March 26, 14:00-15:00 - Andreas Maurer - Munich

Abstract: Bounds for plug-in estimators
Functionals defined on spaces of probability measures are often estimated by applying them to the empirical measure generated by an iid sample. Many properties of these plug-in estimators can be deduced from Lipschitz properties of the functional with respect to metrics on probability measures, such as the total variation, Kolmogorov-Smirnov or Wasserstein metrics. The talk is about extending this idea from first- to second-order Lipschitz properties.
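To make the first-order argument concrete (a standard illustration, not specific to the talk): if the functional F is L-Lipschitz with respect to the Kolmogorov-Smirnov distance, then combining the Lipschitz property with the Dvoretzky-Kiefer-Wolfowitz inequality gives, with probability at least 1 - \delta,
\[
|F(\hat P_n) - F(P)| \;\le\; L\, d_{\mathrm{KS}}(\hat P_n, P) \;\le\; L \sqrt{\frac{\log(2/\delta)}{2n}},
\]
where \hat P_n denotes the empirical measure of the n-point iid sample.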
I briefly review some standard results pertaining to the first-order case. I then explain what I mean by higher-order Lipschitz properties. Already in their weakest form, the second-order properties give a version of Bernstein's inequality and a result on normal approximation. If the functional is second-order Lipschitz with respect to the total variation and Wasserstein-2 metrics, then the plug-in estimator approximates the functional uniformly over the measures generated by some function class, given reasonable bounds on the Gaussian width of the class evaluated at the sample. In machine learning this result makes it possible to apply the popular method of Rademacher and Gaussian complexities to nonlinear objectives.
Applications include smoothed quantiles and functionals defined by kernels, such as those giving rise to U- or V-statistics.
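As a minimal sketch of such a kernel plug-in estimator (the Gaussian kernel and all names below are illustrative choices, not taken from the talk): for F(P) = E[k(X, X')] with X, X' independent with law P, applying F to the empirical measure averages the kernel over all ordered pairs of sample points, which is the V-statistic.

import numpy as np

def gaussian_kernel(x, y, bandwidth=1.0):
    # illustrative bounded, symmetric kernel
    return np.exp(-(x - y) ** 2 / (2 * bandwidth ** 2))

def plug_in_v_statistic(sample, kernel=gaussian_kernel):
    # Plug-in estimate of F(P) = E[k(X, X')]: F evaluated at the empirical
    # measure averages the kernel over all ordered pairs (a V-statistic).
    n = len(sample)
    K = kernel(sample[:, None], sample[None, :])  # pairwise kernel matrix
    return K.sum() / n ** 2

rng = np.random.default_rng(0)
sample = rng.normal(size=500)
print(plug_in_v_statistic(sample))  # approximates E[k(X, X')] for iid standard normals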
