Optimizing a polynomial function on a quantum processor
Abstract: The gradient descent method is central to numerical optimization and is a key ingredient in many machine learning algorithms. It finds a local minimum of a function by iteratively moving in the direction of steepest descent. Since for high-dimensional problems the requir...
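As context for the abstract, here is a minimal sketch of plain classical gradient descent on a one-dimensional polynomial. The example polynomial and all parameter values are assumptions for illustration only; this does not reproduce the paper's quantum algorithm.

```python
# Classical gradient descent on an example quartic polynomial
# f(x) = x^4 - 3x^2 + x (a hypothetical choice, not from the paper).

def f(x):
    return x**4 - 3 * x**2 + x

def grad_f(x):
    # Analytic derivative: f'(x) = 4x^3 - 6x + 1.
    return 4 * x**3 - 6 * x + 1

def gradient_descent(x0, lr=0.01, steps=1000):
    # Repeatedly step against the gradient, i.e. in the
    # direction of steepest descent.
    x = x0
    for _ in range(steps):
        x = x - lr * grad_f(x)
    return x

x_min = gradient_descent(x0=2.0)
```

The iteration converges to a stationary point where the gradient is (numerically) zero; for high-dimensional polynomials each gradient evaluation becomes costly, which is the regime the paper's quantum approach targets.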
| Main Authors: | Keren Li, Shijie Wei, Pan Gao, Feihao Zhang, Zengrong Zhou, Tao Xin, Xiaoting Wang, Patrick Rebentrost, Guilu Long |
|---|---|
| Format: | article |
| Language: | EN |
| Published: | Nature Portfolio, 2021 |
| Subjects: | |
| Online Access: | https://doaj.org/article/ee291d3fa219422aa4c45b7a814186f3 |
Similar Items
- New Properties on Degenerate Bell Polynomials
  by: Taekyun Kim, et al. Published: (2021)
- Simulating quantum computations with Tutte polynomials
  by: Ryan L. Mann. Published: (2021)
- Machine learning of high dimensional data on a noisy quantum processor
  by: Evan Peters, et al. Published: (2021)
- A phononic interface between a superconducting quantum processor and quantum networked spin memories
  by: Tomáš Neuman, et al. Published: (2021)
- Laser-annealing Josephson junctions for yielding scaled-up superconducting quantum processors
  by: Jared B. Hertzberg, et al. Published: (2021)