Eric Moulines

Adjunct Professor of Machine Learning

Research interests

Moulines’ current research topics include high-dimensional Monte Carlo sampling methods, stochastic optimization, and generative models (variational autoencoders, generative adversarial networks). He applies these methods to uncertainty quantification, Bayesian inverse problems, and the control of complex systems.
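As a purely illustrative aside (not taken from the profile), the sketch below shows one of the simplest high-dimensional Monte Carlo sampling methods in this line of work, the unadjusted Langevin algorithm; all function and parameter names are hypothetical, and the target is a standard Gaussian.

    # Minimal illustrative sketch of the unadjusted Langevin algorithm (ULA),
    # one example of the Monte Carlo sampling methods mentioned above.
    # Names and settings are hypothetical, chosen for illustration only.
    import numpy as np

    def ula_sample(grad_log_pi, x0, step=1e-2, n_iters=10_000, rng=None):
        # One ULA step: x_{k+1} = x_k + step * grad log pi(x_k) + sqrt(2 * step) * N(0, I)
        rng = np.random.default_rng() if rng is None else rng
        x = np.asarray(x0, dtype=float).copy()
        samples = np.empty((n_iters, x.size))
        for k in range(n_iters):
            x = x + step * grad_log_pi(x) + np.sqrt(2.0 * step) * rng.standard_normal(x.size)
            samples[k] = x
        return samples

    # Example: a 10-dimensional standard Gaussian target, for which grad log pi(x) = -x.
    draws = ula_sample(lambda x: -x, x0=np.zeros(10))
    print(draws.mean(axis=0))  # empirical mean, close to the zero vector

In the high-dimensional regime studied in this research, a central question is how the step size and number of iterations of such samplers must scale with the dimension for the output to remain accurate.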

In 1990, Moulines joined the Signal and Image Processing Department at Télécom ParisTech, where he was appointed full professor in 1996. In 2015, he moved to the Center for Applied Mathematics at Ecole Polytechnique, where he is currently professor of statistics. His areas of expertise include computational statistics (Monte Carlo simulations, stochastic optimization), probabilistic machine learning, statistical signal processing, and time series analysis (sequential Monte Carlo methods, nonlinear filtering). He is a EURASIP and IMS Fellow.

His current research addresses the computational-statistics challenges posed by the need for fast analysis of ever-larger datasets. It is organized around four themes: (1) understanding and optimizing principled approximate inference in complex statistical models; (2) developing principled statistical approaches for massive datasets and high-dimensional models; (3) federated and distributed computational statistics; and (4) theory and methodology for the optimization of high-dimensional algorithms.

  • Degree in engineering from Ecole Polytechnique, France
  • Ph.D. in electrical engineering from Ecole Nationale Supérieure des Télécommunications, France
  • Best Paper Award from the IEEE Signal Processing Society (for publications in IEEE Transactions on Signal Processing), 1997 and 2006
  • Silver Medal of the Centre National de la Recherche Scientifique (CNRS), 2010
  • Orange Prize of the French Academy of Sciences, 2011
  • Fellow of the Institute of Mathematical Statistics (IMS), 2016
  • Elected to the French Academy of Sciences, 2017
  • EURASIP Technical Achievement Award, 2020

Moulines has published more than 120 articles in leading journals in signal processing, computational statistics, and applied probability, and more than 300 papers in the proceedings of major conferences on signal processing and machine learning.

  • Gersende Fort, Pierre Gach, and Eric Moulines. Fast incremental expectation maximization for finite-sum optimization: nonasymptotic convergence. Statistics and Computing, 31(4):1–24, 2021.
  • Aymeric Dieuleveut, Gersende Fort, Eric Moulines, and Genevieve Robin. Federated-EM with heterogeneity mitigation and variance reduction. In Advances in Neural Information Processing Systems, volume 35, 2021.
  • Alain Durmus, Eric Moulines, Alexey Naumov, Sergey Samsonov, Kevin Scaman, and Hoi-To Wai. Tight high probability bounds for linear stochastic approximation with fixed stepsize. In Advances in Neural Information Processing Systems, volume 34, 2021.
  • Alain Durmus, Eric Moulines, Eero Saksman, et al. Irreducibility and geometric ergodicity of Hamiltonian Monte Carlo. Annals of Statistics, 48(6):3545–3564, 2020.
  • Geneviève Robin, Olga Klopp, Julie Josse, Eric Moulines, and Robert Tibshirani. Main effects and interactions in mixed and incomplete data frames. Journal of the American Statistical Association, 115:1292–1303, 2020.
  • Alain Durmus, Eric Moulines, and Marcelo Pereyra. Efficient Bayesian computation by proximal Markov chain Monte Carlo: when Langevin meets Moreau. SIAM Journal on Imaging Sciences, 11(1):473–506, 2018.
