Large foundation models are central to generative AI, yet they face significant limitations. First, they struggle to adapt to the unpredictable real world, including out-of-distribution data, noisy inputs, and security threats. Second, given their substantial societal impact, it is imperative to foster interdisciplinary collaboration to evaluate their potential benefits and risks, deepen our understanding of human-AI interactions, and promote responsible AI adoption. In this talk, I will share my recent research, insights, and future plans in these critical areas, exploring ways to harness the power of large foundation models while addressing their constraints and fostering responsible AI integration in a rapidly evolving landscape.
Post Talk Link: Click Here
Passcode: $0?K=FB#
Dr. Jindong Wang is currently a Senior Researcher at Microsoft Research Asia. He obtained his Ph.D. from the Chinese Academy of Sciences in 2019. His research interests include robust machine learning, transfer learning, semi-supervised learning, and federated learning; his recent focus is large language models. He has published over 50 papers with 10,000+ citations at leading venues such as ICLR, NeurIPS, TPAMI, TKDE, and IJCV. Several of his papers have been featured by Google Scholar Metrics, Hugging Face, Paper Digest, and Forbes. He received the best paper award at ICCSE'18 and at the IJCAI'19 federated learning workshop. In 2023, he was named one of the World's Top 2% Scientists by Stanford University and one of the AI Most Influential Scholars by AMiner. He serves as an associate editor of IEEE Transactions on Neural Networks and Learning Systems (TNNLS), a guest editor for ACM Transactions on Intelligent Systems and Technology (TIST), a senior program committee member for IJCAI and AAAI, and a reviewer for top venues such as ICML, NeurIPS, ICLR, CVPR, TPAMI, and AIJ. He leads several impactful open-source projects, including transferlearning, PromptBench, torchSSL, USB, personalizedFL, and robustlearn, which together have received over 16K stars on GitHub. He also published a textbook, Introduction to Transfer Learning, to help beginners quickly learn the field, and has given tutorials at IJCAI'22, WSDM'23, KDD'23, and AAAI'24.