**Machine learning for adaptive quantum measurement**

One of the most immediate practical applications of quantum information processing is performing precise quantum measurements. Important examples include the measurement of time with atomic clocks, spatial displacements with optical interferometry, and super-resolved imaging beyond the diffraction limit. These problems can be abstracted to known quantum mappings of input states, but with unknown parameters in the mapping function. Thus quantum measurement can be regarded as a quantum information problem in the following way: what is the best use of quantum resources and the best measurement to extract the most information possible about the unknown parameter? Heisenberg's uncertainty principle provides a fundamental bound on the amount of information a measurement can extract. Measurement schemes employing adaptive feedback constitute a promising strategy for reaching the Heisenberg limit. However, algorithms for finding optimal measurement strategies are hard to develop. We circumvent this challenge by exploiting machine learning.
We develop a quantum-informational description of the adaptive quantum measurement problem by treating the input as a string of qubits, the output as a string of bits, and feedback control as a decision tree. The resulting non-convex optimization problem of finding optimal adaptive feedback protocols is then solved by adapting a particle swarm algorithm. We explain our technique for the case of interferometric phase estimation, which has applications in atomic clocks and gravitational-wave detection. Our algorithm autonomously learns to perform phase estimation based on experimental trial runs, which can be either simulated or performed in a real-world experiment. Our framework does not require prior knowledge of the physical processes involved. Moreover, when trained on a real-world experiment, the algorithm learns to account for systematic experimental imperfections, thereby making time-consuming error modelling and extensive calibration unnecessary.
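The abstract does not include an implementation, but the scheme it describes (a feedback policy evaluated by trial runs and optimized by a particle swarm) can be sketched. The following is a rough illustration under simplifying assumptions, not the authors' actual method: a single-qubit probe with ideal cos² outcome statistics, a policy given by one phase increment per qubit applied with sign set by the last measurement outcome, and sharpness of the phase estimate as the fitness; all function names and parameter values are invented for this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def sharpness(policy, n_trials=200):
    """Fitness of a feedback policy: sharpness |<exp(i(phi_hat - phi))>|.

    policy: vector of phase increments Delta_k, one per input qubit.
    Each trial draws an unknown phase phi, simulates measurements with
    P(u=0) = cos^2((phi - Phi)/2), and updates the controllable phase
    via Phi <- Phi + (-1)^u * Delta_k; the final Phi serves as the
    estimate phi_hat (assumed protocol, for illustration only).
    """
    n = len(policy)
    total = 0j
    for _ in range(n_trials):
        phi = rng.uniform(0, 2 * np.pi)    # unknown phase to estimate
        Phi = 0.0                          # controllable feedback phase
        for k in range(n):
            p0 = np.cos((phi - Phi) / 2) ** 2
            u = 0 if rng.random() < p0 else 1   # one measured bit
            Phi += (-1) ** u * policy[k]        # decision-tree update
        total += np.exp(1j * (Phi - phi))
    return abs(total) / n_trials

def particle_swarm(n_qubits=4, n_particles=20, n_iters=30):
    """Minimal particle swarm maximizing sharpness over [0, 2*pi)^n."""
    w, c1, c2 = 0.7, 1.5, 1.5   # inertia, cognitive, social weights
    x = rng.uniform(0, 2 * np.pi, (n_particles, n_qubits))
    v = np.zeros_like(x)
    pbest = x.copy()
    pbest_f = np.array([sharpness(p) for p in x])
    g = pbest[np.argmax(pbest_f)].copy()    # global best policy
    for _ in range(n_iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = (x + v) % (2 * np.pi)
        f = np.array([sharpness(p) for p in x])
        better = f > pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        g = pbest[np.argmax(pbest_f)].copy()
    return g, pbest_f.max()
```

Because the fitness is estimated from stochastic trial runs, the same loop works whether `sharpness` is backed by a simulator, as here, or by a real experiment, which is what makes the approach model-free.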