Network Inference and Maximum Entropy Estimation on Information Diagrams

Abstract Maximum entropy estimation is of broad interest for inferring properties of systems across many disciplines. Using a recently introduced technique for estimating the maximum entropy of a set of random discrete variables when conditioning on bivariate mutual informations and univariate entropies, we show how this can be used to estimate the direct network connectivity between interacting units from observed activity. As a generic example, we consider phase oscillators and show that our approach is typically superior to simply using the mutual information. In addition, we propose a nonparametric formulation of connected informations, used to test the explanatory power of a network description in general. We give an illustrative example showing how this agrees with the existing parametric formulation, and demonstrate its applicability and advantages for resting-state human brain networks, for which we also discuss its direct effective connectivity. Finally, we generalize to continuous random variables and vastly expand the types of information-theoretic quantities one can condition on. This allows us to establish significant advantages of this approach over existing ones. Not only does our method perform favorably in the undersampled regime, where existing methods fail, but it also can be dramatically less computationally expensive as the cardinality of the variables increases.
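The central quantities the abstract's maximum-entropy estimate conditions on are the bivariate mutual informations between observed units. The following is a minimal sketch (not the authors' algorithm) of the standard plug-in histogram estimate of mutual information; all function names and parameters are illustrative.

```python
# Plug-in estimate of pairwise mutual information from binned observations.
# This only illustrates the conditioning quantity I(X;Y); it is not the
# paper's maximum-entropy network-inference procedure.
import numpy as np

def entropy(counts):
    """Shannon entropy (in bits) from a histogram of counts."""
    p = counts / counts.sum()
    p = p[p > 0]                      # drop empty bins before taking logs
    return -np.sum(p * np.log2(p))

def mutual_information(x, y, bins=8):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) via histogram (plug-in) estimates."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    hx = entropy(joint.sum(axis=1))   # marginal histogram of X
    hy = entropy(joint.sum(axis=0))   # marginal histogram of Y
    hxy = entropy(joint.ravel())      # joint histogram of (X, Y)
    return hx + hy - hxy

# Toy example: a coupled pair should carry far more mutual information
# than an independent pair.
rng = np.random.default_rng(0)
x = rng.normal(size=5000)
y = x + 0.5 * rng.normal(size=5000)   # strongly coupled to x
z = rng.normal(size=5000)             # independent of x
assert mutual_information(x, y) > mutual_information(x, z)
```

In a network-inference setting one would compute this for every pair of units and feed the resulting matrix (together with the univariate entropies) into the maximum-entropy estimate described in the article; note that the plug-in estimator is biased upward for small samples, which is one motivation for the undersampled-regime analysis the abstract mentions.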

Bibliographic Details
Main Authors: Elliot A. Martin, Jaroslav Hlinka, Alexander Meinke, Filip Děchtěrenko, Jaroslav Tintěra, Isaura Oliver, Jörn Davidsen
Format: article
Language: EN
Published: Nature Portfolio, 2017
Subjects: Medicine (R); Science (Q)
Online Access: https://doaj.org/article/6a45a029cd324b9fa75c45f640084384
DOI: 10.1038/s41598-017-06208-w
ISSN: 2045-2322
Publication Date: 2017-08-01
Published in: Scientific Reports, Vol 7, Iss 1, Pp 1-15 (2017)