Sparse Graph Learning Under Laplacian-Related Constraints

Bibliographic Details
Main Author: Jitendra K. Tugnait
Format: Article
Language: EN
Published: IEEE, 2021
Online Access: https://doaj.org/article/57fb50f62d6341fc965c97b77f4da80b
Description
Summary: We consider the problem of learning a sparse undirected graph underlying a given set of multivariate data. We focus on graph Laplacian-related constraints on the sparse precision matrix that encodes conditional dependence between the random variables associated with the graph nodes. Under these constraints the off-diagonal elements of the precision matrix are non-positive (total positivity), and the precision matrix may not be full-rank. We investigate modifications to widely used penalized log-likelihood approaches that enforce total positivity but not the Laplacian structure; the graph Laplacian can then be extracted from the off-diagonal entries of the estimated precision matrix. An alternating direction method of multipliers (ADMM) algorithm is presented and analyzed for constrained optimization under the Laplacian-related constraints with lasso and adaptive lasso penalties. Numerical results based on synthetic data show that the proposed constrained adaptive lasso approach significantly outperforms existing Laplacian-based approaches. We also evaluate our approach on real financial data.
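
The summary describes the optimization only at a high level. As a rough illustration (not the paper's algorithm), the Python sketch below shows the kind of ADMM iteration one might write for a lasso-penalized Gaussian log-likelihood with the off-diagonal non-positivity (total positivity) constraint mentioned above. The function name constrained_glasso_admm, the parameter defaults, the plain (non-adaptive) lasso penalty, and the stopping rule are assumptions made for illustration; the closed-form precision-matrix update via eigendecomposition is the standard step used in ADMM treatments of the graphical lasso.

```python
import numpy as np

def constrained_glasso_admm(S, lam=0.1, rho=1.0, n_iter=200, tol=1e-5):
    """Illustrative ADMM sketch (hypothetical implementation) for
        minimize  -log det(Theta) + tr(S @ Theta) + lam * sum_{i != j} |Theta_ij|
        subject to Theta_ij <= 0 for i != j   (M-matrix sign pattern).
    S is a sample covariance matrix; lam, rho, n_iter, tol are illustrative defaults.
    """
    p = S.shape[0]
    Z = np.eye(p)          # consensus copy of the precision matrix
    U = np.zeros((p, p))   # scaled dual variable

    for _ in range(n_iter):
        # Theta update: closed form via eigendecomposition of rho*(Z - U) - S,
        # which keeps Theta strictly positive definite at every iteration.
        M = rho * (Z - U) - S
        M = 0.5 * (M + M.T)                      # guard against numerical asymmetry
        d, Q = np.linalg.eigh(M)
        theta_eig = (d + np.sqrt(d ** 2 + 4.0 * rho)) / (2.0 * rho)
        Theta = (Q * theta_eig) @ Q.T

        # Z update: prox of the off-diagonal l1 penalty under the sign constraint.
        # For z <= 0 we have |z| = -z, so argmin_{z<=0} lam|z| + (rho/2)(z - a)^2
        # is min(0, a + lam/rho); the diagonal is left unpenalized and unconstrained.
        A = Theta + U
        Z_new = np.minimum(0.0, A + lam / rho)
        np.fill_diagonal(Z_new, np.diag(A))

        # Dual update and a simple relative-change stopping rule.
        U += Theta - Z_new
        if np.linalg.norm(Z_new - Z, "fro") <= tol * max(1.0, np.linalg.norm(Z, "fro")):
            Z = Z_new
            break
        Z = Z_new

    return Z  # estimated precision matrix; negated off-diagonal entries give edge weights


if __name__ == "__main__":
    # Toy usage: S from synthetic data; in practice S = np.cov(X, rowvar=False).
    rng = np.random.default_rng(0)
    X = rng.standard_normal((500, 5))
    Theta_hat = constrained_glasso_admm(np.cov(X, rowvar=False), lam=0.2)
    print(np.round(Theta_hat, 3))
```

A graph Laplacian-like matrix can then be read off by keeping the non-positive off-diagonal entries as negated edge weights and replacing the diagonal with the corresponding row sums, in the spirit of the extraction step described in the summary.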