Overparameterized models, a.k.a. interpolators, are unstable. For example, the minimum-norm least squares interpolator can exhibit unbounded test error on noisy data. In this talk, we study how ensembling stabilizes, and thus improves, the generalization performance of an individual interpolator, measured by the out-of-sample prediction risk. We focus on bagged linear interpolators, as bagging is a popular randomization-based ensemble method that can be implemented in parallel. We introduce the multiplier-bootstrap-based bagged least squares estimator, which can be formulated as an average of sketched least squares estimators. The proposed multiplier bootstrap encompasses the classical bootstrap with replacement as a special case, along with a more intriguing variant which we call the Bernoulli bootstrap. We will also discuss several extensions.
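As a rough illustration of the idea described above (not the speaker's exact formulation), the sketch below builds each bag by drawing random row multipliers, either multinomial resampling counts (classical bootstrap with replacement) or independent Bernoulli keep/drop indicators (an assumed form of the Bernoulli bootstrap), fits the minimum-norm least squares interpolator on the reweighted (sketched) data, and averages the per-bag estimators. The function name `bagged_minnorm_ls` and the keep-probability `q` are hypothetical choices for this example.

```python
import numpy as np

def bagged_minnorm_ls(X, y, n_bags=50, scheme="bernoulli", q=0.7, rng=None):
    """Illustrative sketch: multiplier-bootstrap bagging of the
    minimum-norm least squares interpolator."""
    rng = np.random.default_rng(rng)
    n, p = X.shape
    thetas = []
    for _ in range(n_bags):
        if scheme == "bernoulli":
            # Bernoulli bootstrap: keep each observation independently with prob. q
            w = rng.binomial(1, q, size=n)
        else:
            # classical bootstrap with replacement: multinomial resampling counts
            w = rng.multinomial(n, np.full(n, 1.0 / n))
        s = np.sqrt(w)[:, None]                            # row multipliers ("sketch")
        theta_b = np.linalg.pinv(s * X) @ (s.ravel() * y)  # min-norm LS on sketched data
        thetas.append(theta_b)
    return np.mean(thetas, axis=0)                         # bagged estimator

# toy usage in the overparameterized regime (p > n)
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 200))
y = X @ rng.standard_normal(200) * 0.1 + rng.standard_normal(50)
theta_bag = bagged_minnorm_ls(X, y, n_bags=100)
```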
Post Talk Link: Click Here
Passcode: WipvZ6@x
I am an Associate Professor of Statistics at the University of Toronto (UofT). I am a statistical learner and lead the StatsLE group. Recently, motivated by industrial challenges, we have been interested in deep learning, ensemble learning, generative AI, reinforcement learning, transfer learning, and trustworthy AI. Previously, I was an associate research scholar at Princeton University and then an assistant professor at UofT. I received my PhD from the University of North Carolina at Chapel Hill (UNC-CH) and my BS from the University of Science and Technology of China (USTC).