Mathematics for AI 1 (2025) (Mathématiques pour l'Intelligence Artificielle 1)
Kernel and operator-theoretic methods in machine learning (2026)
- Lecture Notes (work in progress; many typos and errors likely remain!)
- List of papers to be presented by students:
- Concentration Inequalities and Moment Bounds for Sample Covariance Operators.
Vladimir Koltchinskii, Karim Lounici
https://arxiv.org/abs/1405.2468
- Non-asymptotic upper bounds for the reconstruction error of PCA.
Markus Reiss, Martin Wahl
https://arxiv.org/abs/1609.03779
- Optimally tackling covariate shift in RKHS-based nonparametric regression.
Cong Ma, Reese Pathak, Martin J. Wainwright
https://arxiv.org/abs/2205.02986
- Statistical Learning Theory for Neural Operators.
Niklas Reinhardt, Sven Wang, Jakob Zech
https://arxiv.org/abs/2412.17582
- Physics-informed machine learning as a kernel method.
Nathan Doumèche, Francis Bach, Claire Boyer, Gérard Biau
https://arxiv.org/abs/2304.13202
- Nyström Kernel Mean Embeddings.
Antoine Chatalic, Nicolas Schreuder, Lorenzo Rosasco, Alessandro Rudi
https://proceedings.mlr.press/v162/chatalic22a.html
- Efficient Numerical Integration in Reproducing Kernel Hilbert Spaces via Leverage Scores Sampling.
Antoine Chatalic, Nicolas Schreuder, Ernesto De Vito, Lorenzo Rosasco
https://jmlr.org/papers/v26/23-1551.html
- Random feature approximation for general spectral methods.
Mike Nguyen, Nicole Mücke
https://arxiv.org/abs/2506.16283
- The Exact Sample Complexity Gain from Invariances for Kernel Regression.
Behrooz Tahmasebi, Stefanie Jegelka
NeurIPS 2023