Optimizing a polynomial function on a quantum processor
Abstract: The gradient descent method is central to numerical optimization and is the key ingredient in many machine learning algorithms. It promises to find a local minimum of a function by iteratively moving along the direction of the steepest descent. Since for high-dimensional problems the requir...
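The abstract refers to classical gradient descent, which the paper accelerates on a quantum processor. As a point of reference only, below is a minimal classical sketch in Python/NumPy; the example polynomial, learning rate, and step count are illustrative assumptions and are not taken from the paper, whose quantum algorithm is not reproduced here.

```python
import numpy as np

# Example objective: f(x) = sum_i (x_i^4 - 3*x_i^2 + x_i), a simple polynomial
# chosen only to illustrate steepest descent on a multivariate function.
def f(x):
    return np.sum(x**4 - 3 * x**2 + x)

def grad_f(x):
    # Analytic gradient of the example polynomial.
    return 4 * x**3 - 6 * x + 1

def gradient_descent(x0, lr=0.01, steps=500):
    """Iteratively step against the gradient (direction of steepest descent)."""
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        x -= lr * grad_f(x)
    return x

if __name__ == "__main__":
    x_min = gradient_descent([0.5, -0.5, 1.0])
    print("local minimum near:", x_min, "f =", f(x_min))
```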
| Main authors: | Keren Li, Shijie Wei, Pan Gao, Feihao Zhang, Zengrong Zhou, Tao Xin, Xiaoting Wang, Patrick Rebentrost, Guilu Long |
|---|---|
| Format: | article |
| Language: | EN |
| Published: | Nature Portfolio, 2021 |
| Online access: | https://doaj.org/article/ee291d3fa219422aa4c45b7a814186f3 |
Similar documents
- New Properties on Degenerate Bell Polynomials
  by: Taekyun Kim, et al.
  Published: (2021)
- Simulating quantum computations with Tutte polynomials
  by: Ryan L. Mann
  Published: (2021)
- Machine learning of high dimensional data on a noisy quantum processor
  by: Evan Peters, et al.
  Published: (2021)
- A phononic interface between a superconducting quantum processor and quantum networked spin memories
  by: Tomáš Neuman, et al.
  Published: (2021)
- Laser-annealing Josephson junctions for yielding scaled-up superconducting quantum processors
  by: Jared B. Hertzberg, et al.
  Published: (2021)