Optimizing a polynomial function on a quantum processor
Abstract: The gradient descent method is central to numerical optimization and is the key ingredient in many machine learning algorithms. It promises to find a local minimum of a function by iteratively moving along the direction of steepest descent. Since for high-dimensional problems the requir...
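The iteration the abstract describes is the classical update x_{k+1} = x_k − η∇f(x_k). The sketch below illustrates that update on a toy two-variable polynomial; the polynomial, step size, and iteration count are illustrative assumptions, and this is the classical baseline only, not the quantum routine reported in the paper.

```python
# Minimal classical sketch of gradient descent on a toy polynomial.
# The polynomial f, the step size eta, and the number of steps are
# illustrative assumptions, not values taken from the paper.
import numpy as np

def f(x):
    # Toy polynomial in two variables (assumed for illustration).
    return x[0]**4 + 2 * x[0]**2 * x[1]**2 + 3 * x[1]**4

def grad_f(x):
    # Analytic gradient of f.
    return np.array([
        4 * x[0]**3 + 4 * x[0] * x[1]**2,
        4 * x[0]**2 * x[1] + 12 * x[1]**3,
    ])

def gradient_descent(x0, eta=0.05, steps=200):
    # Repeatedly move along the direction of steepest descent.
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x - eta * grad_f(x)
    return x

if __name__ == "__main__":
    x_min = gradient_descent([1.0, -1.0])
    print("approximate minimizer:", x_min, "f(x):", f(x_min))
```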
| Field | Value |
| --- | --- |
| Main authors | Keren Li, Shijie Wei, Pan Gao, Feihao Zhang, Zengrong Zhou, Tao Xin, Xiaoting Wang, Patrick Rebentrost, Guilu Long |
| Format | article |
| Language | EN |
| Published | Nature Portfolio, 2021 |
| Subjects | |
| Online access | https://doaj.org/article/ee291d3fa219422aa4c45b7a814186f3 |
Similar items
- New Properties on Degenerate Bell Polynomials
  by: Taekyun Kim, et al.
  Published: (2021)
- Simulating quantum computations with Tutte polynomials
  by: Ryan L. Mann
  Published: (2021)
- Machine learning of high dimensional data on a noisy quantum processor
  by: Evan Peters, et al.
  Published: (2021)
- A phononic interface between a superconducting quantum processor and quantum networked spin memories
  by: Tomáš Neuman, et al.
  Published: (2021)
- Laser-annealing Josephson junctions for yielding scaled-up superconducting quantum processors
  by: Jared B. Hertzberg, et al.
  Published: (2021)