Neural Canonical Transformation with Symplectic Flows

Bibliographic Details
Main Authors: Shuo-Hui Li, Chen-Xiao Dong, Linfeng Zhang, Lei Wang
Format: Article
Language: English
Published: American Physical Society, 2020
Subjects:
Online Access: https://doaj.org/article/b7777a0445ae48eea297b3565afeaa27
Description
Summary: Canonical transformation plays a fundamental role in simplifying and solving classical Hamiltonian systems. Intriguingly, it has a natural correspondence to normalizing flows with a symplectic constraint. Building on this key insight, we design a neural canonical transformation approach to automatically identify independent slow collective variables in general physical systems and natural datasets. We present an efficient implementation of symplectic neural coordinate transformations and two ways to train the model, based either on the Hamiltonian function or on phase-space samples. The learned model maps physical variables onto an independent representation where collective modes with different frequencies are separated, which can be useful for various downstream tasks such as compression, prediction, control, and sampling. We demonstrate the ability of this method first by analyzing toy problems and then by applying it to real-world problems, such as identifying and interpolating slow collective modes of the alanine dipeptide molecule and MNIST database images.
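
To illustrate the idea described in the summary, below is a minimal sketch (not the authors' released code) of a symplectic normalizing flow built from alternating gradient "shear" updates, q -> q + grad T(p) and p -> p - grad V(q). Each update has a unit-triangular Jacobian whose off-diagonal block is a Hessian and therefore symmetric, so every layer, and hence the composition, is a canonical transformation. PyTorch is assumed; the class names, network sizes, and layer counts (SymplecticShear, SymplecticFlow, hidden=64, n_layers=6) are hypothetical placeholders, not the paper's architecture.

import torch
import torch.nn as nn


def _grad(scalar_net, x):
    # Gradient of a scalar-valued network with respect to its input,
    # computed with autograd so it can be backpropagated through.
    with torch.enable_grad():
        x = x if x.requires_grad else x.detach().requires_grad_(True)
        return torch.autograd.grad(scalar_net(x).sum(), x, create_graph=True)[0]


class SymplecticShear(nn.Module):
    # One layer: p -> p - dV/dq (or q -> q + dT/dp when update_q=True).
    # The Jacobian is unit-triangular with a symmetric Hessian block,
    # so the map is symplectic for any smooth learned potential.
    def __init__(self, dim, hidden=64, update_q=False):
        super().__init__()
        self.update_q = update_q
        self.potential = nn.Sequential(
            nn.Linear(dim, hidden), nn.Tanh(), nn.Linear(hidden, 1)
        )

    def forward(self, q, p):
        if self.update_q:
            return q + _grad(self.potential, p), p
        return q, p - _grad(self.potential, q)


class SymplecticFlow(nn.Module):
    # Alternating shears acting on q and p; the composition remains
    # symplectic, and its log-Jacobian is identically zero.
    def __init__(self, dim, n_layers=6):
        super().__init__()
        self.layers = nn.ModuleList(
            [SymplecticShear(dim, update_q=bool(i % 2)) for i in range(n_layers)]
        )

    def forward(self, q, p):
        for layer in self.layers:
            q, p = layer(q, p)
        return q, p


if __name__ == "__main__":
    flow = SymplecticFlow(dim=2)
    q, p = torch.randn(8, 2), torch.randn(8, 2)
    Q, P = flow(q, p)          # transformed canonical coordinates
    print(Q.shape, P.shape)    # torch.Size([8, 2]) torch.Size([8, 2])

Because each layer preserves phase-space volume, the log-Jacobian of the composed map vanishes, which is what makes the two training routes mentioned in the summary tractable in this sketch: matching a given Hamiltonian function, or maximizing likelihood directly on phase-space samples.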