Kavli Foundation Special Symposium

Michelle Girvan
University of Maryland, College Park
Eun-Ah Kim
Cornell University
Roger Melko
University of Waterloo
John Preskill
Caltech
Patrick Riley
Google

Opening the black box: Improving knowledge-free machine learning with knowledge-based models

Michelle Girvan, University of Maryland, College Park

In recent years, machine learning methods such as "deep learning" have proven enormously successful for tasks such as image classification, voice recognition, and more. Despite their effectiveness for big-data classification problems, these methods have had limited success at time series prediction, especially for complex systems like those we see in weather, solar activity, and even brain dynamics. In this talk, I will discuss how a Reservoir Computer (RC), a special kind of artificial neural network that offers a "universal" dynamical system, can draw on its own internal complex dynamics in order to forecast systems like the weather far beyond the time horizon of other methods. Like many other machine learning architectures, the RC provides a knowledge-free approach because it builds forecasts purely from past measurements, without any specific knowledge of the system dynamics. By building a new approach that judiciously combines the knowledge-free prediction of the RC with a knowledge-based model, we demonstrate a further, dramatic improvement in forecasting complex systems. This hybrid approach can give us new insights into the weaknesses of our knowledge-based models and also reveal limitations in our machine learning system, guiding improvements in both knowledge-free and knowledge-based prediction techniques.
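The knowledge-free half of this approach can be sketched with a minimal echo state network, a common reservoir-computing architecture: a fixed random recurrent network is driven by past measurements, and only a linear readout is trained. Everything below (the toy sine-wave series, reservoir size, spectral radius, ridge parameter) is illustrative and not the speaker's actual setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy one-dimensional series to forecast; a stand-in for measurements
# of a real dynamical system.
t = np.arange(0, 60, 0.1)
series = np.sin(t)

n_res = 200              # reservoir size (illustrative)
spectral_radius = 0.9    # keeps the reservoir dynamics stable

# Fixed random reservoir; only the linear readout is trained.
W_in = rng.uniform(-0.5, 0.5, size=n_res)
W = rng.standard_normal((n_res, n_res))
W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))

def run_reservoir(inputs):
    """Drive the reservoir with a 1-D input signal; return its internal states."""
    states = np.zeros((len(inputs), n_res))
    x = np.zeros(n_res)
    for i, u in enumerate(inputs):
        x = np.tanh(W @ x + W_in * u)
        states[i] = x
    return states

# Train the readout by ridge regression: reservoir state -> next value.
train, test = series[:500], series[500:550]
states = run_reservoir(train[:-1])[50:]     # discard a 50-step washout
targets = train[51:]
ridge = 1e-6
W_out = np.linalg.solve(states.T @ states + ridge * np.eye(n_res),
                        states.T @ targets)

# One-step-ahead forecasts on held-out data.
pred = run_reservoir(test[:-1]) @ W_out
rmse = np.sqrt(np.mean((pred - test[1:]) ** 2))
print(f"one-step test RMSE: {rmse:.4f}")
```

In the hybrid scheme described above, the reservoir's prediction would be combined with the output of an imperfect knowledge-based model; only the knowledge-free half is shown here.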

Machine Learning and Quantum Emergence

Eun-Ah Kim, Cornell University

Decades of effort in improving computing power and experimental instrumentation have been driven by our desire to better understand the complex problem of quantum emergence. However, the increasing volume and variety of data available to us today present new challenges. I will discuss how these challenges can be embraced and turned into opportunities by employing machine learning. It is important to note that the scientific questions in the field of electronic quantum matter require fundamentally new approaches to data science for two reasons: (1) quantum mechanical imaging of electronic behavior is probabilistic, and (2) inference from data should be subject to fundamental laws governing microscopic interactions. Hence machine learning for quantum emergence requires the collective wisdom of data science and condensed matter physics. I will review rapidly developing efforts by the community in using machine learning to solve problems and gain new insight. I will then present my group's results on the machine-learning-based analysis of complex experimental data on quantum matter.

Machine Learning and the Complexity of Quantum Simulation

Roger Melko, University of Waterloo

Computational approaches to condensed matter have long influenced the theoretical development of quantum many-body physics. For example, quantum Monte Carlo (QMC) ties our understanding of the computational efficiency of simulating quantum systems to the sign structure of the Hamiltonian. The Density Matrix Renormalization Group motivated the modern field of Tensor Networks (TNs), which relate computational efficiency to the entanglement entropy of a wavefunction. This trend now continues with the rapidly developing field of machine learning, which has introduced a host of new strategies and architectures for the simulation and data-driven reconstruction of quantum many-body systems. Over the last three years, progress has been made in framing various machine learning approaches within the context of traditional methods such as QMC or TNs; however, it has also become apparent that some aspects differ substantially. In this talk I will provide an overview of the landscape of machine learning strategies for simulating quantum systems. I will speculate on how our theoretical framework of quantum many-body physics can influence the development of new machine learning strategies, and vice versa.
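One widely discussed example of a machine-learning ansatz for quantum simulation is a restricted Boltzmann machine (RBM) used to parameterize a many-body wavefunction. The sketch below is purely illustrative (random, untrained parameters on four spins, exact enumeration rather than sampling); it only shows that such a network defines a normalizable state, with variational optimization and Monte Carlo sampling omitted.

```python
import itertools
import numpy as np

rng = np.random.default_rng(1)

n_spins, n_hidden = 4, 8

# Random RBM parameters; in variational Monte Carlo these would be
# optimized to minimize the energy of a target Hamiltonian.
a = 0.1 * rng.standard_normal(n_spins)       # visible biases
b = 0.1 * rng.standard_normal(n_hidden)      # hidden biases
W = 0.1 * rng.standard_normal((n_hidden, n_spins))

def amplitude(s):
    """Unnormalized RBM wavefunction amplitude for a spin config s in {-1,+1}^n."""
    theta = b + W @ s
    return np.exp(a @ s) * np.prod(2 * np.cosh(theta))

# Enumerate all 2^n configurations (feasible only for tiny systems)
# and normalize, to check that the RBM defines a valid wavefunction.
configs = np.array(list(itertools.product([-1, 1], repeat=n_spins)))
psi = np.array([amplitude(s) for s in configs])
probs = psi**2 / np.sum(psi**2)
print("sum of probabilities:", probs.sum())
```

For realistic system sizes the exact enumeration above is replaced by Monte Carlo sampling of configurations, which is where the connection to traditional QMC methods becomes concrete.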

Quantum Computing: Current Status and Future Prospects

John Preskill, Caltech

Noisy Intermediate-Scale Quantum (NISQ) technology is now becoming available for the first time. Quantum computers with of order 100 qubits may be able to perform tasks which surpass the capabilities of today's classical digital computers, but noise in quantum gates will limit the size of quantum circuits that can be executed reliably. NISQ devices will be useful tools for exploring many-body quantum physics, and may have other useful applications, but the 100-qubit quantum computer will not change the world right away; we should regard it as a significant step toward the more powerful quantum technologies of the future. Quantum technologists should continue to strive for more accurate quantum gates and, eventually, fully fault-tolerant quantum computing.

Vignettes of machine learning in the natural sciences

Patrick Riley, Google

The promise of machine learning to aid in scientific discovery is now a frequent topic in journals and conferences. For the last 5 years, the Google Accelerated Science team has been partnering with scientists in a variety of natural science fields to make this promise a reality. I’ll sample a few of our achievements with examples in scientific computing and molecular design while also highlighting the current and future challenges in the effective use of machine learning in the sciences.