This is a MATLAB function implementing maximum a posteriori (MAP) estimation of
the precision matrix in a generative Gaussian Graphical Model (GGM), where a
Laplace prior (i.e., an L1 penalty on the negative log-likelihood) has been
placed on the entries of the precision matrix (also known as the `Graphical Lasso').
Formally, for a given empirical covariance matrix S and a regularization
strength v, we seek to maximize the following expression over the matrix X:
log |X| - < S,X > - v||X||_1
In the above, |X| denotes the determinant of X, < S,X > is the matrix inner
product (the trace of S'X), and ||X||_1 is the sum of the absolute values of
the elements of X (NOT the induced matrix L1-norm). To yield a valid
precision/covariance matrix, this optimization must be performed subject to a
symmetry constraint and a positive-definiteness constraint on X.
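As a rough illustration of the objective, here is a NumPy sketch (not the
MATLAB code itself; the function name ggm_objective and the use of slogdet
are choices made here for the example):

```python
import numpy as np

def ggm_objective(X, S, v):
    """Penalized log-likelihood maximized by the Graphical Lasso.

    X : candidate precision matrix (symmetric positive-definite)
    S : empirical covariance matrix
    v : regularization strength
    """
    # slogdet is numerically safer than log(det(X)) for larger matrices
    sign, logdet = np.linalg.slogdet(X)
    if sign <= 0:
        return -np.inf  # X lies outside the positive-definite cone
    return logdet - np.trace(S @ X) - v * np.abs(X).sum()

# Example: evaluate the identity precision on a 2-variable problem
S = np.array([[1.0, 0.3],
              [0.3, 1.0]])
print(ggm_objective(np.eye(2), S, v=0.1))  # 0 - 2.0 - 0.2 = -2.2
```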
This constrained optimization is solved using the method of (Banerjee, El
Ghaoui, d'Aspremont, and Natsoulis, "Convex optimization techniques for
fitting sparse Gaussian graphical models", ICML 2006), where Block Coordinate
Descent is applied to the convex dual, and each column of the matrix is
optimized in turn by solving a bound-constrained Quadratic Program.
There are two available options for solving the Quadratic Programs: if the
Quadratic Programming in C (QPC) mex files are available, then the qpas mex
file from that software will be used.
Otherwise, the method formulates each block update as a Lasso optimization
problem and solves it using the `Shooting' algorithm (Fu, "Penalized
regressions: the bridge versus the lasso", JCGS 1998), as in the Graphical
Lasso method.
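The Shooting algorithm is plain coordinate descent on the Lasso objective:
each coefficient is updated in turn by soft-thresholding its univariate
least-squares solution. A minimal NumPy sketch under these assumptions (the
function names are invented here and are not part of the package):

```python
import numpy as np

def soft_threshold(z, t):
    # Shrink z toward zero by t; the prox operator of the L1 penalty
    return np.sign(z) * max(abs(z) - t, 0.0)

def shooting_lasso(A, b, lam, n_iter=200):
    """Coordinate-descent ('shooting') solver for
    min_w 0.5*||A w - b||^2 + lam*||w||_1."""
    n, d = A.shape
    w = np.zeros(d)
    col_sq = (A ** 2).sum(axis=0)  # per-coordinate curvature ||A_j||^2
    for _ in range(n_iter):
        for j in range(d):
            # Residual with coordinate j's contribution removed
            r_j = b - A @ w + A[:, j] * w[j]
            # Univariate least-squares fit, then soft-threshold
            w[j] = soft_threshold(A[:, j] @ r_j, lam) / col_sq[j]
    return w
```

With lam = 0 this reduces to ordinary least squares; increasing lam shrinks
coefficients toward zero and eventually sets them exactly to zero, which is
what produces sparsity in the estimated precision matrix.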
This code was used in the paper "Modeling changing dependency structure in
multivariate time series" by Xiang Xuan and Kevin Murphy (ICML 2007).