Michael Jordan

Laureate Professor and Honorary Program Director

Research interests

Jordan’s research interests bridge the computational, statistical, cognitive, biological, and social sciences. In the 1980s he developed recurrent neural networks as a cognitive model; his more recent work is driven less by a cognitive perspective and more by the foundations of traditional statistics. Jordan popularized Bayesian networks in the machine learning community and is known for pointing out links between machine learning and statistics. He has also been prominent in the formalization of variational methods for approximate inference and in the popularization of the expectation-maximization (EM) algorithm in machine learning.
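To make the EM reference above concrete, here is a minimal textbook-style sketch of EM for a two-component, one-dimensional Gaussian mixture. This is a generic illustration, not code from Jordan’s own work, and all names (`em_gmm_1d`, initialization choices) are illustrative assumptions:

```python
import math
import random

def em_gmm_1d(x, n_iter=50):
    """EM for a two-component 1-D Gaussian mixture (illustrative sketch)."""
    # Initialize: means at the data extremes, equal weights, unit variances.
    mu = [min(x), max(x)]
    var = [1.0, 1.0]
    pi = [0.5, 0.5]
    for _ in range(n_iter):
        # E-step: responsibility of each component for each point,
        # r[k] proportional to pi_k * N(x_i | mu_k, var_k).
        resp = []
        for xi in x:
            d = [pi[k] / math.sqrt(2 * math.pi * var[k])
                 * math.exp(-(xi - mu[k]) ** 2 / (2 * var[k])) for k in range(2)]
            s = d[0] + d[1]
            resp.append([d[0] / s, d[1] / s])
        # M-step: re-estimate weights, means, variances from responsibilities.
        for k in range(2):
            nk = sum(r[k] for r in resp)
            pi[k] = nk / len(x)
            mu[k] = sum(r[k] * xi for r, xi in zip(resp, x)) / nk
            var[k] = sum(r[k] * (xi - mu[k]) ** 2 for r, xi in zip(resp, x)) / nk
    return pi, mu, var

# Example: recover two well-separated clusters centered at -3 and +3.
random.seed(1)
x = [random.gauss(-3, 1) for _ in range(500)] + [random.gauss(3, 1) for _ in range(500)]
pi, mu, var = em_gmm_1d(x)
```

Each iteration alternates a probabilistic soft assignment of points to components (E-step) with closed-form maximum-likelihood updates given those assignments (M-step), which is the structure the variational view generalizes.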

Michael I. Jordan has been a world-leading researcher in the field of statistical machine learning for nearly four decades. His contributions at the interface between computer science and statistics include the variational approach to statistical inference and learning, inference methods based on graphical models and Bayesian nonparametrics, and characterizations of trade-offs between statistical risk and computational complexity.

He has also worked at the interface between optimization and machine learning, where he is well known for his development of continuous-time models of gradient-based optimization and sampling, and his work on distributed systems for optimization.
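As a generic illustration of the continuous-time viewpoint (a standard sketch under simple assumptions, not Jordan’s specific models), plain gradient descent can be read as the forward-Euler discretization of the gradient-flow ODE dx/dt = -∇f(x):

```python
def gradient_flow_euler(grad, x0, step=0.01, n_steps=2000):
    """Forward-Euler discretization of the gradient flow dx/dt = -grad_f(x).
    As step -> 0 the iterates trace the continuous-time trajectory;
    a fixed step size recovers ordinary gradient descent."""
    x = x0
    for _ in range(n_steps):
        x = x - step * grad(x)  # Euler update: x_{t+1} = x_t - step * grad_f(x_t)
    return x

# Minimize f(x) = (x - 2)^2, whose gradient is 2 * (x - 2);
# the flow converges to the unique minimizer x = 2.
x_star = gradient_flow_euler(lambda x: 2 * (x - 2), x0=10.0)
```

Analyzing the ODE rather than the discrete iteration often simplifies convergence arguments, which are then transferred back to the algorithm through the discretization.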

Jordan has built bridges between machine learning and control theory, contributing to the theory of reinforcement learning, learning-based model predictive control, and optimality principles for human motor control. He has also led the way in bringing microeconomic concepts into contact with machine learning, developing learning methods that incentivize learners to share data, showing how contract theory can be employed for statistical inference, and contributing to the study of learning-based matching markets.

Jordan is the Pehong Chen Distinguished Professor in the Department of Electrical Engineering and Computer Sciences and the Department of Statistics at the University of California, Berkeley. He is a member of the National Academy of Sciences, the National Academy of Engineering, and the American Academy of Arts and Sciences, a Foreign Member of the Royal Society, and a Fellow of the American Association for the Advancement of Science.

He was a Plenary Lecturer at the International Congress of Mathematicians in 2018. He received the Ulf Grenander Prize from the American Mathematical Society in 2021, the IEEE John von Neumann Medal in 2020, the IJCAI Research Excellence Award in 2016, the David E. Rumelhart Prize in 2015, and the ACM/AAAI Allen Newell Award in 2009.  He gave the Inaugural IMS Grace Wahba Lecture in 2022, the IMS Neyman Lecture in 2011, and an IMS Medallion Lecture in 2004. He was the inaugural winner of the World Laureates Association Prize in Computer Science or Mathematics in 2022.

  • Ph.D. in cognitive science from the University of California, San Diego, USA.
  • Master of Science in mathematics from Arizona State University, USA.
  • Bachelor of Science in psychology from Louisiana State University, USA.
  • Ulf Grenander Prize in Stochastic Theory and Modeling, American Mathematical Society, 2021.
  • Vannevar Bush Faculty Fellowship, 2021-2026.
  • Honorary Doctorate of Engineering and Technology, Yale University, 2020.
  • John von Neumann Medal, IEEE, 2020.
  • World’s Most Innovative People Award, World Summit on Innovation and Entrepreneurship, 2019.
  • Miller Research Professorship, University of California, Berkeley, 2017-2018.
  • IJCAI Award for Research Excellence, 2016.
  • David E. Rumelhart Prize, 2015.
  • Fellow, International Society for Bayesian Analysis (ISBA), 2014.
  • Fellow, Society for Industrial and Applied Mathematics (SIAM), 2012.
  • Fellow, Association for Computing Machinery (ACM), 2010.
  • Fellow, Cognitive Science Society (CSS), 2010.
  • ACM/AAAI Allen Newell Award, 2009.
  • Honorary Professor of Hebei University, China, 2009.
  • SIAM Activity Group on Optimization Prize, 2008.
  • Miller Research Professorship, University of California, Berkeley, 2008.
  • Fellow, American Statistical Association (ASA), 2007.
  • Fellow, American Association for the Advancement of Science (AAAS), 2006.
  • IEEE Neural Networks Pioneer Award, 2006.
  • Pehong Chen Distinguished Professorship, University of California, 2006.
  • Diane S. McEntyre Award for Excellence in Teaching, 2006.
  • Fellow, Institute of Mathematical Statistics (IMS), 2005.
  • Fellow, Institute of Electrical and Electronics Engineers (IEEE), 2005.
  • Fellow, American Association for Artificial Intelligence (AAAI), 2002.
  • MIT Class of 1947 Career Development Award, 1992-1995.
  • NSF Presidential Young Investigator Award, 1991-1996.

Publications

Jordan is one of the leading figures in machine learning, and in 2016 Science named him the world’s most influential computer scientist.

  • J. D. Lee, M. Jordan, B. Recht, and M. Simchowitz, “Gradient Descent Only Converges to Minimizers,” in Proceedings of the 29th Conference on Learning Theory (COLT), New York, USA, 2016, pp. 1246-1257.
  • X. Pan, M. Lam, S. Tu, D. Papailiopoulos, C. Zhang, M. Jordan, K. Ramchandran, C. Ré, and B. Recht, “Cyclades: Conflict-free Asynchronous Machine Learning,” in Advances in Neural Information Processing Systems 29, 2016.
  • X. Pan, D. Papailiopoulos, S. Oymak, B. Recht, K. Ramchandran, and M. Jordan, “Parallel correlation clustering on big graphs,” in Advances in Neural Information Processing Systems 28, 2015, pp. 82-90.
  • X. Pan, S. Jegelka, J. E. Gonzalez, J. K. Bradley, and M. Jordan, “Parallel Double Greedy Submodular Maximization,” in Advances in Neural Information Processing Systems 27, 2014.
  • X. Pan, J. E. Gonzalez, S. Jegelka, T. Broderick, and M. Jordan, “Optimistic concurrency control for distributed unsupervised learning,” in Advances in Neural Information Processing Systems 26, 2013, pp. 1403--1411.
  • B. Taskar, S. Lacoste-Julien, and M. Jordan, “Structured prediction, dual extragradient and Bregman projections,” J. Machine Learning Research, vol. 7, pp. 1627-1653, Dec. 2006.
