Dimensionality reduction (DR) techniques play a critical role in data mining and pattern recognition. However, most DR approaches involve large-scale matrix computations, whose running complexity is too high to handle big data efficiently. Recent developments in quantum information processing offer a novel path to alleviating this problem, since a potential quantum speedup can be obtained over the classical counterpart. Nevertheless, existing proposals for quantum DR methods face a common dilemma in nonlinear generalization owing to the intrinsic linearity of quantum computation. In this paper, an architecture for simulating arbitrary nonlinear kernels on a universal quantum computer is illustrated, and the quantum kernel principal component analysis (QKPCA) algorithm is further proposed. The key idea is to employ a truncated Taylor expansion to approximate an arbitrary nonlinear kernel within a fixed error, and then to construct the corresponding Hamiltonian simulation for the quantum phase estimation algorithm. It is demonstrated theoretically that QKPCA is qualified for the nonlinear DR task while the exponential speedup is maintained. In addition, this research has the potential to inform other quantum DR approaches and existing linear quantum machine learning models.
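The key idea of the abstract — approximating a nonlinear kernel by a truncated Taylor expansion and then performing PCA on the resulting kernel matrix — can be sketched classically. The sketch below is illustrative only, not the quantum algorithm itself: the function names (`taylor_kernel`, `kernel_pca`), the choice of the exponential kernel as the nonlinear kernel, and the truncation degree are assumptions made for the example, standing in for the arbitrary analytic kernel and the Hamiltonian-simulation step of the paper.

```python
import math
import numpy as np

def taylor_kernel(X, Y, degree=8):
    """Truncated Taylor approximation of the exponential kernel:
    exp(x . y) ~= sum_{k=0}^{degree} (x . y)^k / k!
    The truncation degree controls the approximation error."""
    G = X @ Y.T  # pairwise inner products
    return sum(G**k / math.factorial(k) for k in range(degree + 1))

def kernel_pca(X, n_components=2, degree=8):
    """Classical kernel PCA on the Taylor-approximated kernel matrix."""
    n = X.shape[0]
    K = taylor_kernel(X, X, degree)
    # Center the kernel matrix in feature space
    one = np.ones((n, n)) / n
    Kc = K - one @ K - K @ one + one @ K @ one
    # Eigendecompose; eigh returns eigenvalues in ascending order
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:n_components]
    vals, vecs = vals[idx], vecs[:, idx]
    # Scale eigenvectors so projections use unit-norm feature-space axes
    alphas = vecs / np.sqrt(np.maximum(vals, 1e-12))
    return Kc @ alphas  # projected data, shape (n, n_components)

# Small demonstration: the truncated kernel tracks exp(x . y) closely
X = np.random.RandomState(0).randn(10, 3) * 0.3
K_approx = taylor_kernel(X, X, degree=8)
K_exact = np.exp(X @ X.T)
print(np.max(np.abs(K_approx - K_exact)))  # small truncation error
Z = kernel_pca(X, n_components=2)
```

The quantum algorithm replaces the explicit eigendecomposition with quantum phase estimation applied to a Hamiltonian encoding of the approximated kernel matrix, which is where the claimed exponential speedup arises.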
Focus on Quantum Machine Learning

Gian Giacomo Guerreschi, Intel, USA
Roger Melko, University of Waterloo and the Perimeter Institute for Theoretical Physics, Canada
Nathan Wiebe, Microsoft, USA
Two of the most exciting areas of scientific and technological progress today centre on powerful advances in machine learning and a rapid arms race to build quantum hardware. The integration of the two fields is at the heart of a new interdisciplinary research effort: quantum machine learning. While the potential for disruptive advances is staggering yet unknown, many fundamental challenges to combining these two technologies remain. This issue forms a select resource spanning the latest cutting-edge research in this rapidly developing field. Through a single high-quality collection, readers across the world will be able to see the current "state of the science" in an interdisciplinary area of growing importance within the machine learning and quantum technology communities. Specific topics covered include:
- Classical machine learning applied to quantum systems;
- Algorithms for machine learning on quantum hardware;
- Machine learning for quantum experiments;
- Frontiers in quantum machine learning.
The articles listed below are the first accepted contributions to the collection, and further additions will appear on an ongoing basis.