The study of symmetries in physics has revolutionized our understanding of the world. Inspired by this, the development of methods to incorporate internal (gauge) and external (space-time) symmetries into machine learning models is a very active field of research. We will present our work on invariant generative models and their applications to lattice QCD and molecular dynamics simulations. On the molecular dynamics front, we will discuss how we constructed permutation- and translation-invariant normalizing flows on a torus for free-energy estimation. In lattice QCD, we will present the work that introduced the first U(N) and SU(N) gauge-equivariant normalizing flows for pure-gauge simulations, and its extensions to incorporate fermions.
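As background for the techniques named above (this is an illustrative sketch, not the speakers' code): a normalizing flow pushes a simple base density through an invertible map while tracking the log-determinant of the Jacobian via the change-of-variables formula. A minimal one-dimensional example with an affine map and a standard normal base, using NumPy:

```python
import numpy as np

def affine_flow(x, scale, shift):
    """Invertible affine map y = scale * x + shift, with log|det J| per point."""
    y = scale * x + shift
    log_det = np.log(np.abs(scale)) * np.ones_like(x)
    return y, log_det

def log_prob(y, scale, shift):
    """Log-density of y under the flow applied to a standard normal base:
    log p(y) = log N(x; 0, 1) - log|det J|, where x is the inverse image of y."""
    x = (y - shift) / scale                      # inverse map
    base_log_prob = -0.5 * (x**2 + np.log(2 * np.pi))
    return base_log_prob - np.log(np.abs(scale))

# Pushing N(0, 1) through y = 2x + 1 yields N(1, 4); check the density at y = 1:
y = np.array([1.0])
assert np.allclose(log_prob(y, 2.0, 1.0), -0.5 * np.log(2 * np.pi * 4.0))
```

The invariant flows discussed in the talk build on the same formula, but use maps constrained so that the model density is unchanged under the relevant symmetry group (permutations, translations, or gauge transformations).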
Danilo J. Rezende is a senior staff researcher and lead of the Generative Models and Inference group at DeepMind, London. For the past 12 years, his research has focused on scalable inference and generative models applied to reinforcement learning and to the modelling of complex data such as medical images, videos, 3D scene geometry, and complex physical systems. He has co-authored more than 90 papers and patents, among them several highly cited papers on approximate inference and modelling with neural networks (such as deep latent Gaussian models, normalizing flows, and interaction networks). Highlights of his recent work at the intersection of AI and physics include equivariant normalizing flows for lattice QCD and molecular dynamics. Rezende is engaged in promoting the alliance between ML/AI, physics, and geometry. He holds a Bachelor of Arts in Physics and an M.Sc. in Theoretical Physics from École Polytechnique (Palaiseau, France) and the Institute of Theoretical Physics (SP, Brazil). Once an aspiring Ph.D. student in theoretical physics at the Centre de Physique Théorique in Marseille, France, he switched to a Ph.D. in Computational Neuroscience at École Polytechnique Fédérale de Lausanne (Lausanne, Switzerland), where he studied computational/statistical models of learning and sensory fusion.