I have developed the SLEP (Sparse Learning with Efficient Projections) Package.
The current SLEP package (version 4.0) supports optimization with the following penalties:
Lasso. It yields a sparse solution by applying the L1 norm.
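For illustration, the proximal operator of the L1 norm is elementwise soft-thresholding, which is what produces the exact zeros in the Lasso solution. A minimal Python sketch (illustrative only; the SLEP package itself is implemented in Matlab/C):

```python
import math

def soft_threshold(v, lam):
    """Proximal operator of lam * ||x||_1: shrink each entry toward
    zero by lam, setting entries with |x_i| <= lam exactly to zero."""
    return [math.copysign(max(abs(x) - lam, 0.0), x) for x in v]
```

Entries whose magnitude falls below the threshold are zeroed out, which is the source of the sparsity.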
Fused Lasso. It enforces sparsity in both the coefficients and their successive differences. It is desirable for applications with features ordered in some meaningful way.
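The fused-lasso penalty combines an L1 term on the coefficients with an L1 term on their successive differences. A small Python sketch of the penalty value (illustrative; the regularization parameters lam1 and lam2 are hypothetical names, not SLEP's interface):

```python
def fused_lasso_penalty(x, lam1, lam2):
    """lam1 * sum_i |x_i|  +  lam2 * sum_i |x_{i+1} - x_i|.
    The first term encourages sparse coefficients; the second
    encourages a piecewise-constant (sparse-difference) profile."""
    l1 = sum(abs(xi) for xi in x)
    diffs = sum(abs(b - a) for a, b in zip(x, x[1:]))
    return lam1 * l1 + lam2 * diffs
```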
Group Lasso. It yields a solution with group sparsity by using the L1/Lq norm (q>1). It is suitable for applications with pre-specified non-overlapping groups.
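For the common q=2 case, the proximal operator of the non-overlapping group-lasso penalty shrinks each group as a block, zeroing out entire groups whose norm falls below the threshold. An illustrative Python sketch (not the package's Matlab implementation; the group encoding as index lists is an assumption):

```python
import math

def group_soft_threshold(x, groups, lam):
    """Prox of lam * sum_g ||x_g||_2 over non-overlapping groups.
    Each group is scaled by max(1 - lam/||x_g||_2, 0), so groups
    with ||x_g||_2 <= lam become exactly zero."""
    out = list(x)
    for g in groups:
        norm = math.sqrt(sum(x[i] ** 2 for i in g))
        scale = max(1.0 - lam / norm, 0.0) if norm > 0 else 0.0
        for i in g:
            out[i] = scale * x[i]
    return out
```

Note that sparsity occurs at the group level: a group is either kept (and shrunk) or discarded entirely.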
Sparse Group Lasso. It yields a solution with both within-group and between-group sparsity by combining the L1 norm and the L1/Lq norm. It is desirable for selecting a small number of groups and a small number of features within those groups.
Tree Structured Group Lasso. It is suitable for applications where the features follow a certain tree structure. In this penalty, we first compute the L2 norm for each tree node, and then take a weighted sum of these L2 norms.
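The penalty value itself is straightforward to evaluate once the tree nodes and their weights are given. An illustrative Python sketch (the encoding of nodes as weighted index sets is a hypothetical input format, not SLEP's interface):

```python
import math

def tree_penalty(x, nodes):
    """Weighted sum of L2 norms over tree nodes.
    nodes: list of (weight, index_list) pairs, where each index_list
    collects the features covered by one node of the tree."""
    return sum(w * math.sqrt(sum(x[i] ** 2 for i in idx))
               for w, idx in nodes)
```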
Overlapping Group Lasso. It is suitable for applications where the features form certain groups and one feature may appear in several groups. In this penalty, we first compute the L2 norm for each group, and then take a weighted sum of these L2 norms.
Nuclear Norm. It yields a low-rank solution. It is applicable to applications where the model parameters have a low-rank structure.
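The proximal operator of the nuclear norm is singular value thresholding: soft-threshold the singular values and reconstruct. A sketch using NumPy (illustrative only, not the package's implementation):

```python
import numpy as np

def svt(A, lam):
    """Singular value thresholding: prox of lam * ||A||_* .
    Soft-thresholding the singular values zeroes out the small ones,
    which is what produces the low-rank solution."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    s_thr = np.maximum(s - lam, 0.0)
    return U @ np.diag(s_thr) @ Vt
```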
Sparse Inverse Covariance Estimation. It computes a sparse estimate of the inverse of the empirical covariance matrix. It can help reveal the conditional dependence relationships among different features.
The current version of the SLEP package is implemented in Matlab, with the associated Euclidean projections written in C. New functions and features are being added to the SLEP package, and a C version will be released soon.