Spike Recovery from Large Random Tensors with Application to Machine Learning

Wednesday, September 13, 2023

In this talk, we delve into the asymptotic study of large asymmetric spiked tensor models, expanding upon classical principal component analysis to encompass high-order tensors. Our investigation unveils a fundamental connection: these models can be effectively studied by examining an equivalent random matrix constructed through contractions of the original tensor along its singular vectors. Within this framework, we explore three distinctive statistical models: 1) A rank-one spiked model, 2) Higher-rank models with tensor deflation, and 3) A nested matrix-tensor model that incorporates a joint low-rank matrix and tensor structure. Additionally, we demonstrate the practical relevance of our findings by applying them to the analysis of some classical learning methods, assuming the presence of a hidden low-rank tensor structure within the data.
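The matrix-equivalent idea in the abstract can be illustrated with a small numerical sketch for the rank-one case. The example below is an assumption-laden toy (dimension, signal-to-noise ratio, and Gaussian noise model are all illustrative choices, not taken from the talk): a rank-one spiked order-3 tensor is contracted along its third mode with the planted vector, and the leading singular vectors of the resulting random matrix recover the other two spike components.

```python
import numpy as np

rng = np.random.default_rng(0)
n, beta = 200, 3.0  # illustrative dimension and signal-to-noise ratio

# Unit-norm planted spike components (hypothetical signal vectors)
x = rng.standard_normal(n); x /= np.linalg.norm(x)
y = rng.standard_normal(n); y /= np.linalg.norm(y)
z = rng.standard_normal(n); z /= np.linalg.norm(z)

# Rank-one spiked tensor: T = beta * x (x) y (x) z + W / sqrt(n),
# with W an i.i.d. standard Gaussian noise tensor
W = rng.standard_normal((n, n, n))
T = beta * np.einsum('i,j,k->ijk', x, y, z) + W / np.sqrt(n)

# Contracting T along its third mode with the unit vector z yields an
# n x n spiked random matrix: beta * x y^T plus a Gaussian noise matrix
M = np.einsum('ijk,k->ij', T, z)

# Above the detection threshold, the leading singular vectors of M
# align strongly with the planted components x and y
u, s, vt = np.linalg.svd(M)
print(abs(u[:, 0] @ x), abs(vt[0] @ y))
```

In practice the contraction directions are not known and must themselves be estimated (e.g., via tensor power iteration); this sketch only shows why the contracted matrix is an informative object to study.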


Post Talk Link:  Click Here 

Passcode: &%11R2wV

Speaker

Mohamed El Amine Seddik earned his M.Sc. in Data Science and Machine Learning from the Institut Mines-Télécom de Lille-Douai in 2017, with the final year completed between Télécom ParisTech and ENS Paris-Saclay. He then completed a Ph.D. in Signal and Image Processing at CentraleSupélec and Université Paris-Saclay in 2020, and conducted postdoctoral research at École Polytechnique in 2021. He also worked as a researcher at the Mathematical and Algorithmics Research Lab of Huawei Technologies France in Paris. He currently holds a senior researcher position at the Technology Innovation Institute in Abu Dhabi. His main research interests include machine learning, deep learning, random matrix theory, and random tensor theory.