Sparse Graph Learning Under Laplacian-Related Constraints

Bibliographic details
Main author: Jitendra K. Tugnait
Format: Article
Language: English
Published: IEEE, 2021
Online access: https://doaj.org/article/57fb50f62d6341fc965c97b77f4da80b
Abstract: We consider the problem of learning a sparse undirected graph underlying a given set of multivariate data. We focus on graph Laplacian-related constraints on the sparse precision matrix that encodes conditional dependence between the random variables associated with the graph nodes. Under these constraints the off-diagonal elements of the precision matrix are non-positive (total positivity), and the precision matrix may not be full-rank. We investigate modifications to widely used penalized log-likelihood approaches to enforce total positivity but not the Laplacian structure. The graph Laplacian can then be extracted from the off-diagonal part of the precision matrix. An alternating direction method of multipliers (ADMM) algorithm is presented and analyzed for constrained optimization under Laplacian-related constraints with lasso as well as adaptive lasso penalties. Numerical results based on synthetic data show that the proposed constrained adaptive lasso approach significantly outperforms existing Laplacian-based approaches. We also evaluate our approach on real financial data.
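
To make the setting concrete, below is a minimal illustrative sketch in Python/NumPy of an ADMM iteration for a penalized log-likelihood problem with a lasso penalty and the total-positivity (non-positive off-diagonal) constraint described in the abstract. It is not the paper's exact algorithm: the function name, parameter defaults, and stopping rule are assumptions made for illustration only.

```python
import numpy as np

def admm_sparse_precision(S, lam=0.1, rho=1.0, n_iter=500, tol=1e-6):
    """Illustrative ADMM sketch (not the paper's exact algorithm).

    Minimizes  -log det(Theta) + tr(S @ Theta) + lam * ||Theta||_{1,off}
    subject to Theta_ij <= 0 for i != j (total positivity constraint).
    """
    p = S.shape[0]
    Z = np.eye(p)              # sparse copy of the precision matrix
    U = np.zeros((p, p))       # scaled dual variable
    for _ in range(n_iter):
        # Theta-update: closed form via eigendecomposition of rho*(Z - U) - S
        w, Q = np.linalg.eigh(rho * (Z - U) - S)
        d = (w + np.sqrt(w ** 2 + 4.0 * rho)) / (2.0 * rho)
        Theta = (Q * d) @ Q.T
        # Z-update: prox of the off-diagonal l1 penalty plus the non-positivity constraint
        V = Theta + U
        Z_new = np.sign(V) * np.maximum(np.abs(V) - lam / rho, 0.0)  # soft-threshold
        Z_new = np.minimum(Z_new, 0.0)           # clip off-diagonals to be non-positive
        np.fill_diagonal(Z_new, np.diag(V))      # diagonal is unpenalized and unconstrained
        # Dual ascent step and a simple stopping rule
        U += Theta - Z_new
        if np.linalg.norm(Z_new - Z, "fro") <= tol * max(1.0, np.linalg.norm(Z, "fro")):
            Z = Z_new
            break
        Z = Z_new
    return Z
```

For an adaptive lasso penalty, the scalar lam would be replaced by an entry-wise weight matrix (e.g., weights inversely proportional to the magnitudes of an initial estimate). As the abstract notes, a graph Laplacian can then be read off the result: with W_ij = -Z_ij for i != j and W_ii = 0, take L = diag(W 1) - W.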