Enabling Fast, Robust, and Personalized Federated Learning

Saturday, December 16, 2023

In many large-scale machine learning applications, data is acquired and processed at the edge nodes of the network, such as mobile phones and IoT sensors. While distributed learning at the edge can enable a variety of new applications, it faces major systems bottlenecks that severely limit its reliability and scalability, including system heterogeneity, data heterogeneity, and communication overhead. In this talk, we focus on federated learning, a distributed machine learning approach in which a model is trained over a set of devices, such as mobile phones, while the data is kept local. We first present a straggler-resilient federated learning scheme that uses adaptive node participation to tackle the challenge of system heterogeneity. We next present a robust optimization formulation for federated learning that addresses the data heterogeneity challenge. Finally, we discuss a new algorithm for personalizing the learned models for different users.
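To fix ideas, the basic federated learning loop described above can be sketched as follows. This is a minimal, illustrative FedAvg-style simulation with a linear model and synthetic clients of different data sizes; the function names, hyperparameters, and setup are assumptions for exposition and do not reflect the speaker's actual algorithms.

```python
import numpy as np

def local_update(w, X, y, lr=0.1, steps=10):
    """Run a few steps of local gradient descent on one client's data;
    the data (X, y) never leaves the client."""
    w = w.copy()
    for _ in range(steps):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def fed_avg(w, clients, participating):
    """Aggregate the participating clients' local models,
    weighted by each client's data size."""
    sizes = np.array([len(clients[i][1]) for i in participating])
    updates = [local_update(w, *clients[i]) for i in participating]
    return np.average(updates, axis=0, weights=sizes)

rng = np.random.default_rng(0)
true_w = np.array([1.0, -2.0])

# Heterogeneous clients: different amounts of local data.
clients = []
for n in (20, 50, 100):
    X = rng.normal(size=(n, 2))
    y = X @ true_w + 0.01 * rng.normal(size=n)
    clients.append((X, y))

w = np.zeros(2)
for _ in range(30):
    # Here every client participates each round; a straggler-resilient
    # scheme would instead adapt which clients join a given round.
    w = fed_avg(w, clients, participating=[0, 1, 2])

print(w)  # should approach true_w
```

The weighting by data size is the standard FedAvg choice; the talk's robust formulation and personalization methods would replace this plain average with objectives that are less sensitive to heterogeneous client distributions.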

Speaker

Ramtin Pedarsani is an associate professor in the ECE department at UCSB. He obtained his Ph.D. in Electrical Engineering and Computer Sciences from UC Berkeley in 2015. He received his M.Sc. degree from EPFL in 2011 and his B.Sc. degree from the University of Tehran in 2009. His research interests include machine learning, optimization, information and coding theory, and stochastic networks. He is the recipient of the Communications/Information Theory Society joint paper award in 2020 and the best paper award at the IEEE International Conference on Communications (ICC) in 2014.
