The videos of the lectures are recorded, see here. A more detailed set of notes and videos is given below for each lecturer:
Gerard Ben Arous: video1, video2, video3.
Sebastian Goldt: video1, video2.
Alice Guionnet: Notes Random Matrix Theory and Statistical Learning, video2, video3.
Florent Krzakala: Notes Statistical Physics and Machine Learning 101 and video1, video2, video3.
Pierfrancesco Urbani: video1, video2, video3.
Matthieu Wyart: video1, video2.
Student presentations on day 1 are here.
Detailed schedule for participant seminars
Thursday 6th
17:30 - 18:15 Maciej Koch-Janusz: Identifying the physically relevant degrees of freedom video
18:30 - 18:45 Chiara Marullo: Neural Networks beyond the Hebbian paradigm video
18:45 - 19:00 Jorge Fernandez-de-Cossio-Diaz: Interpretable representations and adversarial training of Restricted Boltzmann Machines video
19:00 - 19:15 Moshir Harsh: 'Place-cell' emergence and learning of invariant data with restricted Boltzmann machines video
Friday 7th
17:30 - 18:15 Raphael Berthier: Convergence of stochastic gradient descent under the noiseless model: rates depending on the regularity of the data. video
18:30 - 18:45 Anastasia Koloskova: Decentralized Optimization for Machine Learning video
18:45 - 19:00 Grzegorz Adam Gluch: Constructing a provably adversarially-robust classifier from a high-accuracy one video
19:00 - 19:15 Athina Monemvassitis: Some challenges in sampling video
Monday 10th
17:30 - 18:15 Marylou Gabrié: The entropy paper and the blind calibration paper, which push forward our ability to tackle learned matrices. video
18:30 - 18:45 Giovanni Piccioli: The angular synchronization problem video
18:45 - 19:00 Lorenzo Dall'Amico: Community detection in sparse time-evolving graphs with a dynamical Bethe-Hessian video
19:00 - 19:15 Tiffany Joyce Vlaar: Constraint-based Regularization of Neural Networks video
Tuesday 11th
18:30 - 18:45 Antoine Maillard: Phase retrieval in high dimensions: statistical and computational phase transitions video
18:45 - 19:00 Cédric Gerbelot-Barrillon: Asymptotic errors of convex generalized linear models beyond Gaussian matrices video
19:00 - 19:15 Mirko Pieropan: Expectation propagation for the diluted Bayesian classifier video
Wednesday 12th
17:30 - 17:45 Francesca Mignacco: Dynamical mean field theory for stochastic gradient descent video.
17:45 - 18:00 Stefano Sarao Mannelli: Thresholds of descending algorithms in planted problems video.
18:00 - 18:15 Antonio Sclocchi: Critical jammed phase with linear potential: spheres and perceptron video.
18:30 - 18:45 Hugo Cui: Large deviations for the perceptron video.
18:45 - 19:00 Leonardo Petrini: Compressing invariant manifolds in neural nets video.
19:00 - 19:15 Michiel Straat: Dynamics of on-line learning in two-layer neural networks in the presence of concept drift video.
Thursday 13th
17:30 - 17:45 Manuela Girotti: A note on condition numbers for first-order optimization video
17:45 - 18:00 Maria Refinetti: Double Trouble in Double Descent: Bias and Variance(s) in the Lazy Regime video.
18:00 - 18:15 Stéphane d'Ascoli: Triple descent and the two kinds of overfitting video.
18:30 - 18:45 Mario Geiger: Feature and lazy learning regimes video.
18:45 - 19:00 Ruben Ohana: Optical and recurrent random features video.
19:00 - 19:15 Jonathan Dong: Deep fluorescence microscopy with two-layer neural networks video.
Code of conduct
We commit to providing a positive experience to all participants. We will not tolerate any abuse of power or bullying, including but not limited to that based on race, gender, religion, social class, gender identity, sexual orientation, or other forms of identity. We will proactively take a stance against any form of discrimination, harassment, and retaliation. We will follow the revised NeurIPS guidelines, which can be reviewed at https://neurips.cc/public/CodeOfConduct.
Sponsors