Statistical Mechanics of Deep Linear Neural Networks: The Backpropagating Kernel Renormalization

The groundbreaking success of deep learning in many real-world tasks has triggered an intense effort to theoretically understand the power and limitations of deep learning in the training and generalization of complex tasks, so far with limited progress. In this work, we study the statistical mechanics of learning in deep linear neural networks (DLNNs) in which the input-output function of an individual unit is linear. Despite the linearity of the units, learning in DLNNs is highly nonlinear; hence, studying its properties reveals some of the essential features of nonlinear deep neural networks (DNNs). Importantly, we exactly solve the network properties following supervised learning using an equilibrium Gibbs distribution in the weight space. To do this, we introduce the backpropagating kernel renormalization (BPKR), which allows for the incremental integration of the network weights layer by layer starting from the network output layer and progressing backward until the first layer’s weights are integrated out. This procedure allows us to evaluate important network properties, such as its generalization error, the role of network width and depth, the impact of the size of the training set, and the effects of weight regularization and learning stochasticity. BPKR does not assume specific statistics of the input or the task’s output. Furthermore, by performing partial integration of the layers, the BPKR allows us to compute the emergent properties of the neural representations across the different hidden layers. We propose a heuristic extension of the BPKR to nonlinear DNNs with rectified linear units (ReLU). Surprisingly, our numerical simulations reveal that despite the nonlinearity, the predictions of our theory are largely shared by ReLU networks of modest depth, in a wide regime of parameters. Our work is the first exact statistical mechanical study of learning in a family of deep neural networks, and the first successful theory of learning through the successive integration of degrees of freedom in the learned weight space.
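For orientation, the construction named in the abstract can be sketched compactly. The display below is a minimal schematic of the equilibrium Gibbs distribution over weights and of the backward, layer-by-layer integration that BPKR performs; the notation (Θ for the pooled weights, a Gaussian prior of variance σ², P training pairs, hidden width N) is our own shorthand, and the normalizations are illustrative assumptions rather than the paper's exact conventions.

```latex
% Minimal schematic of the setup described in the abstract; notation and
% normalizations are illustrative assumptions, not the paper's conventions.
% A deep linear network with L hidden layers computes
%   f(x) = a^T W_L \cdots W_1 x ,
% and after training the weights \Theta = {W_1, ..., W_L, a} are assumed to
% follow an equilibrium Gibbs distribution with a Gaussian prior:
\[
  P(\Theta) \propto \exp\!\Big( -\beta\, E(\Theta)
      - \tfrac{1}{2\sigma^{2}} \lVert \Theta \rVert^{2} \Big),
  \qquad
  E(\Theta) = \tfrac{1}{2} \sum_{\mu=1}^{P}
      \big( f(x^{\mu}; \Theta) - y^{\mu} \big)^{2}.
\]
% BPKR evaluates the partition function Z = \int d\Theta\, e^{-\beta E - \dots}
% one layer at a time, starting from the readout a. Each Gaussian integral
% leaves an effective energy that depends on the data only through the Gram
% (kernel) matrix of the layer below,
\[
  K_{\ell}^{\mu\nu} = \tfrac{1}{N}\, h_{\ell}(x^{\mu}) \cdot h_{\ell}(x^{\nu}),
\]
% so integrating out layer \ell replaces K_\ell by a renormalized kernel,
% iterated backward from \ell = L down to \ell = 1.
```

Note that even though f(x) is linear in the input, E(Θ) is a polynomial of degree 2(L+1) in the weights, which is why the Gibbs measure, and hence learning, is highly nonlinear despite the linear units.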


Saved in: DOAJ
Bibliographic Details
Main Authors: Qianyi Li, Haim Sompolinsky
Format: Article
Language: English
Published: American Physical Society, 2021
Published in: Physical Review X, Vol 11, Iss 3, p 031059 (2021)
DOI: 10.1103/PhysRevX.11.031059
ISSN: 2160-3308
Subjects: Physics (QC1-999)
Online Access: https://doaj.org/article/510738e70abd43f7a350f976de4a2e33