Using machine learning for measuring and controlling quantum systems

Success in reliably measuring and controlling quantum systems has recently triggered many technological and scientific breakthroughs: atomic clocks determine time via precise measurements of atomic oscillations; a purely quantum effect, giant magnetoresistance, is used in modern hard drives to read out the stored data; and gravitational wave detectors use interferometers to search for tiny deformations in space caused by gravitational waves. At the fundamental level, measurement precision is limited by Heisenberg's uncertainty principle, but even reaching a precision close to the Heisenberg bound is far beyond existing technology because of source and detector limitations. Adaptive measurement strategies are promising because they can greatly reduce these technological requirements. However, finding good adaptive protocols, even for simple quantum systems, is very hard and often relies on clever guesswork. Fortunately, artificial intelligence suggests a promising approach: a machine learning program learns from its own performance and devises better problem-solving strategies for the future.

In my talk, I will describe how machine learning can be used to devise adaptive quantum measurement and quantum control protocols. I will explain our technique using the example of measuring an interferometric phase shift, which is important for applications such as gravitational wave detection, where the wave imposes an unknown phase difference between the two arms of a Mach-Zehnder interferometer.
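To make the idea of an adaptive measurement concrete, the sketch below simulates single photons passing through a Mach-Zehnder interferometer with an unknown phase and a controllable feedback phase, and uses a simple Bayesian update-and-refocus policy rather than the learned protocols discussed in the talk. The function name, the grid-based posterior, and the feedback rule are all illustrative assumptions, not the authors' method.

```python
import math
import random

def adaptive_phase_estimate(true_phase, n_photons=300, grid=360, seed=1):
    """Toy Bayesian adaptive estimation of an unknown interferometer phase.

    Each photon exits the bright port with probability cos^2((phi - theta)/2),
    where phi is the unknown phase and theta a controllable feedback phase.
    After every detection the posterior over phi is updated and theta is
    re-chosen; the feedback rule here (offset by pi/2 from the posterior
    mean, where the fringe slope is steepest) is a simple illustrative choice.
    """
    rng = random.Random(seed)
    phis = [2 * math.pi * k / grid for k in range(grid)]
    post = [1.0 / grid] * grid          # uniform prior over [0, 2*pi)
    theta = 0.0
    for _ in range(n_photons):
        p0 = math.cos((true_phase - theta) / 2) ** 2
        hit = rng.random() < p0          # simulated detector click
        # Bayesian update of the posterior on the phase grid
        for k, phi in enumerate(phis):
            q = math.cos((phi - theta) / 2) ** 2
            post[k] *= q if hit else (1 - q)
        s = sum(post)
        post = [w / s for w in post]
        # circular posterior mean, used both as estimate and for feedback
        cx = sum(w * math.cos(p) for w, p in zip(post, phis))
        sx = sum(w * math.sin(p) for w, p in zip(post, phis))
        mean = math.atan2(sx, cx) % (2 * math.pi)
        theta = (mean + math.pi / 2) % (2 * math.pi)
    return mean
```

A learned protocol, as in the talk, would replace the hand-crafted feedback rule with a policy optimized from the measurement record itself; the simulation scaffold stays the same.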