Gaussian Variational Inference in high dimension

Tuesday, March 12, 2024

We consider the problem of approximating a high-dimensional distribution by a Gaussian one by minimizing the Kullback-Leibler divergence. The main result extends Katsevich and Rigollet (2023) and states that the minimizer is well approximated by the Gaussian distribution whose mean and covariance match those of the underlying measure. We also quantify the accuracy of this approximation and its range of applicability in terms of the effective dimension. The results can be used to analyze various sampling schemes in optimization.
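
To make the setup concrete, here is a minimal sketch of the standard Gaussian variational inference problem the abstract refers to: fitting a Gaussian q to an unnormalized target p(x) ∝ exp(-f(x)) by minimizing KL(q‖p). Everything in this sketch is an illustrative assumption, not material from the talk: the toy quartic target f, the diagonal covariance parameterization, and all hyperparameters (step size, sample count, iteration budget).

```python
# A minimal sketch (assumptions, not the speaker's method): fit
# q = N(m, diag(exp(2*log_s))) to p(x) ∝ exp(-f(x)) by minimizing
# KL(q || p) with the reparameterization trick and gradient descent.
import jax
import jax.numpy as jnp

def f(x):
    # Hypothetical toy negative log-density: a slightly non-Gaussian potential.
    return 0.5 * jnp.sum(x**2) + 0.1 * jnp.sum(x**4)

def kl_objective(params, key, n_samples=64):
    m, log_s = params
    eps = jax.random.normal(key, (n_samples, m.shape[0]))
    x = m + jnp.exp(log_s) * eps  # reparameterized samples from q
    # KL(q||p) = E_q[f(x)] - H(q) + const, and for a diagonal Gaussian
    # H(q) = sum(log_s) + const, so we minimize E_q[f] - sum(log_s).
    return jnp.mean(jax.vmap(f)(x)) - jnp.sum(log_s)

d = 10
params = (jnp.zeros(d), jnp.zeros(d))  # mean m, log standard deviations log_s
key = jax.random.PRNGKey(0)
grad_fn = jax.jit(jax.grad(kl_objective))
lr = 1e-2
for step in range(500):
    key, sub = jax.random.split(key)
    g = grad_fn(params, sub)
    params = jax.tree_util.tree_map(lambda p, gp: p - lr * gp, params, g)

m, log_s = params
print("fitted mean:", m)
print("fitted stddev:", jnp.exp(log_s))
```

The talk's result concerns the exact minimizer of this KL objective over Gaussians: in high dimension it is close to the Gaussian with the target's own mean and covariance, with accuracy controlled by the effective dimension.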


Post Talk Link:  Click Here 

Passcode: ^&5=SUZ*

Speaker

Vladimir Spokoiny received his PhD from the Department of Mechanics and Mathematics of the Lomonosov Moscow State University and completed his Habilitation on “Statistical Experiments and Decisions: Asymptotic Theory” at the Humboldt University of Berlin. He is the head of the research group “Stochastic Algorithms & Nonparametric Statistics” at the Weierstrass Institute for Applied Analysis & Stochastics and a Professor at the Humboldt University of Berlin. Spokoiny’s main research interests lie in adaptive nonparametric smoothing and hypothesis testing, high-dimensional data analysis, and nonlinear time series, with applications to financial and imaging sciences. Spokoiny is a Fellow of the Institute of Mathematical Statistics (IMS).