Séminaire Probabilités et Statistiques
Dynamical Analysis of Deep and Wide Neural Networks
June 2022
Speaker: Karl Hajjar
Institution: LMO
Time: 15h45 - 16h15
Location: 3L15

Neural networks have had tremendous empirical success in many different tasks, but the reasons behind their performance remain unclear from a theoretical point of view. In this talk, we will briefly present how the infinite-width limit of NNs has recently emerged as a way to shed light on some aspects of the problem. In particular, we will discuss recent results on the "mean-field" limit of NNs and put them in perspective with the literature on the Neural Tangent Kernel. Finally, we will present two lines of work on infinitely wide NNs: one on how to scale and adapt the mean-field limit to deep networks using Tensor Programs, and a second exploring the symmetries in the dynamics of infinitely wide two-layer networks.
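As a brief illustration (not part of the original abstract), the mean-field and NTK limits mentioned above differ, for a two-layer network of width $m$, in how the output layer is scaled; the notation below ($a_j$ for output weights, $w_j$ for input weights, $\sigma$ for the activation) is a standard convention, not taken from the talk:

```latex
% Mean-field scaling: average over neurons; as m -> infinity the training
% dynamics become a gradient flow on the distribution of the (a_j, w_j).
f_{\mathrm{MF}}(x) = \frac{1}{m} \sum_{j=1}^{m} a_j\, \sigma(\langle w_j, x\rangle)
% NTK scaling: divide by sqrt(m) instead; in this limit the network trains
% in the "lazy" (kernel) regime governed by the Neural Tangent Kernel.
f_{\mathrm{NTK}}(x) = \frac{1}{\sqrt{m}} \sum_{j=1}^{m} a_j\, \sigma(\langle w_j, x\rangle)
```

The $1/m$ versus $1/\sqrt{m}$ factor is what separates the two regimes discussed in the talk: features move in the mean-field limit, while the NTK limit behaves like a fixed kernel method.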
