Matrix Factorization, Algorithms, Applications, and Available Packages
Matrix decomposition has
a long history and generally centers on a set of well-known factorizations such as LU, QR, SVD and eigendecompositions. More recent
factorizations came to light with the advent of NMF, k-means and related algorithms [1].
However, with new methods based on random projections and convex optimization that started in part in the compressive
sensing literature, we are seeing another surge of very diverse algorithms dedicated to many different kinds of matrix factorizations, with new constraints based on rank, positivity and/or sparsity. As a result of this large increase in interest,
I have decided to keep a list of them here, following the success of the big
picture in compressive sensing.
The sources for this list include the following most excellent sites: Stephen
Becker's page, Raghunandan H. Keshavan's page, Nuclear
Norm and Matrix Recovery through SDP by Christoph Helmberg, and Arvind
Ganesh's Low-Rank Matrix Recovery and Completion via Convex
Optimization, which provide more in-depth additional information. Additional codes were also featured on Nuit
Blanche. The following people provided additional input: Olivier Grisel and Matthieu
Puigt.
Most of the algorithms listed below rely on the nuclear norm as a proxy for the rank functional, which
may not be optimal. Currently, CVX (Michael
Grant and Stephen Boyd) consistently allows one to explore other
proxies for the rank functional, such as the log-det heuristic
of Maryam Fazel, Haitham
Hindi and Stephen Boyd. ** is used to show that an algorithm uses
a heuristic other than the nuclear norm.
In terms of notation, A refers to a matrix, L to a low-rank matrix, S to a sparse one and N to a noisy one. This page lists the different codes that implement the following matrix factorizations: Matrix Completion, Robust
PCA, Noisy Robust PCA, Sparse PCA, NMF, Dictionary Learning, MMV, Randomized Algorithms and other factorizations. Some of these toolboxes implement several of these decompositions and are listed accordingly. Before I list an algorithm here, I generally
feature it on Nuit Blanche under the MF tag: http://nuit-blanche.blogspot.com/search/label/MF, or you
can also subscribe to the Nuit Blanche feed.
Matrix Completion, A = H.*L with H a known mask and L unknown; solve
for L of lowest possible rank
The idea of this approach is to complete the unknown coefficients of a matrix based on the fact that the matrix is low rank:
- OptSpace: Matrix
Completion from a Few Entries by Raghunandan H. Keshavan, Andrea
Montanari, and Sewoong Oh - LMaFit: Low-Rank Matrix Fitting
- ** Penalty
Decomposition Methods for Rank Minimization by Zhaosong Lu and Yong
Zhang. The attendant MATLAB code is here. - Jellyfish: Parallel
Stochastic Gradient Algorithms for Large-Scale Matrix Completion, B. Recht, C. Re, Apr 2011 - GROUSE:
Online Identification and Tracking of Subspaces from Highly Incomplete Information, L. Balzano, R. Nowak, B. Recht, 2010 - SVP: Guaranteed
Rank Minimization via Singular Value Projection, R. Meka, P. Jain, I. S. Dhillon, 2009 - SET:
SET: an algorithm for consistent matrix completion, W. Dai, O. Milenkovic, 2009 - NNLS: An
accelerated proximal gradient algorithm for nuclear norm regularized least squares problems, K. Toh, S. Yun, 2009 - FPCA: Fixed point
and Bregman iterative methods for matrix rank minimization, S. Ma, D. Goldfarb, L. Chen, 2009 - SVT: A singular value thresholding
algorithm for matrix completion, J-F Cai, E.J. Candes, Z. Shen, 2008
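To make the problem concrete, here is a minimal NumPy sketch in the spirit of SVP (and of hard-impute): take a gradient step on the observed entries, then project back onto the set of rank-r matrices. It is an illustration of the idea, not a reimplementation of any package listed above; the function name and defaults are mine.

```python
import numpy as np

def svp_complete(A_obs, mask, rank, step=1.0, iters=300):
    """Matrix completion in the spirit of SVP: gradient step on the
    observed entries, then projection onto the rank-r matrices."""
    X = np.zeros_like(A_obs, dtype=float)
    for _ in range(iters):
        # gradient step, restricted to the observed entries
        X = X + step * mask * (A_obs - X)
        # project onto rank-r matrices by truncating the SVD
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        X = (U[:, :rank] * s[:rank]) @ Vt[:rank]
    return X
```

On an easy synthetic instance (an incoherent rank-2 matrix with a majority of entries observed) this toy iteration already recovers the missing entries; the listed codes scale to far larger problems and choose the step size more carefully.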
Noisy Robust PCA, A = L + S + N with L, S, N unknown, solve
for L low rank, S sparse, N noise
- GoDec :
Randomized Low-rank and Sparse Matrix Decomposition in Noisy Case - ReProCS: The Recursive
Projected Compressive Sensing code (example)
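GoDec approximates A ≈ L + S + N by alternating two cheap projections: a rank-r approximation of A − S and a keep-the-k-largest-entries projection of A − L. The sketch below uses a plain truncated SVD where the published GoDec uses bilateral random projections for speed; the function name and defaults are mine.

```python
import numpy as np

def godec(A, rank, card, iters=50):
    """Toy GoDec: alternate a rank-r approximation of A - S with a
    keep-the-card-largest-entries projection of A - L."""
    S = np.zeros_like(A)
    L = np.zeros_like(A)
    for _ in range(iters):
        # low-rank step: best rank-r approximation of A - S
        U, s, Vt = np.linalg.svd(A - S, full_matrices=False)
        L = (U[:, :rank] * s[:rank]) @ Vt[:rank]
        # sparse step: keep the card largest-magnitude residual entries
        R = A - L
        S = np.zeros_like(A)
        keep = np.argsort(np.abs(R), axis=None)[-card:]
        S.flat[keep] = R.flat[keep]
    return L, S
```

The leftover A − L − S plays the role of the noise term N; with the right rank and cardinality, the recovered L is accurate down to the noise level.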
Robust PCA: A = L + S with L and S unknown; solve for L low
rank, S sparse
- Robust PCA :
Two Codes that go with the paper “Two
Proposals for Robust PCA Using Semidefinite Programming” by Michael
McCoy and Joel Tropp - SPAMS (SPArse
Modeling Software) - ADMM: Alternating
Direction Method of Multipliers, “Fast Automatic
Background Extraction via Robust PCA” by Ivan Papusha. The poster
is here. The MATLAB implementation is here. - PCP: Generalized
Principal Component Pursuit - Augmented Lagrange Multiplier (ALM) Method [exact ALM - MATLAB zip]
[inexact ALM - MATLAB zip], Reference
- The Augmented Lagrange Multiplier Method
for Exact Recovery of Corrupted Low-Rank Matrices, Z. Lin, M. Chen, L. Wu, and Y. Ma (UIUC Technical Report UILU-ENG-09-2215, November 2009) - Accelerated Proximal Gradient, Reference - Fast
Convex Optimization Algorithms for Exact Recovery of a Corrupted Low-Rank Matrix, Z. Lin, A. Ganesh, J. Wright, L. Wu, M. Chen, and Y. Ma (UIUC Technical Report UILU-ENG-09-2214, August 2009)[full SVD version - MATLAB zip]
[partial SVD version - MATLAB zip] - Dual Method [MATLAB zip],
Reference - Fast Convex Optimization
Algorithms for Exact Recovery of a Corrupted Low-Rank Matrix, Z. Lin, A. Ganesh, J. Wright, L. Wu, M. Chen, and Y. Ma (UIUC Technical Report UILU-ENG-09-2214, August 2009). - Singular Value Thresholding [MATLAB zip].
Reference - A Singular Value Thresholding Algorithm
for Matrix Completion, J.-F. Cai, E. J. Candès, and Z. Shen (2008). - Alternating Direction Method [MATLAB zip]
, Reference - Sparse and Low-Rank Matrix
Decomposition via Alternating Direction Methods, X. Yuan, and J. Yang (2009). - LMaFit: Low-Rank Matrix Fitting
- Bayesian robust
PCA - Compressive-Projection
PCA (CPPCA)
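Most of the ALM/ADMM codes above solve principal component pursuit: minimize ||L||_* + λ||S||_1 subject to A = L + S. Here is a compact inexact-ALM-style sketch in NumPy; λ = 1/sqrt(max(m, n)) and the increasing penalty µ are the usual heuristics from the Candès et al. and Lin et al. papers, and the function names are mine.

```python
import numpy as np

def shrink(X, tau):
    """Entrywise soft-thresholding."""
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def svd_shrink(X, tau):
    """Soft-threshold the singular values of X."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

def rpca_ialm(A, lam=None, iters=200, tol=1e-7):
    """Inexact-ALM-style principal component pursuit: A = L + S."""
    m, n = A.shape
    if lam is None:
        lam = 1.0 / np.sqrt(max(m, n))      # standard PCP weight
    norm_A = np.linalg.norm(A)
    mu = 1.25 / np.linalg.norm(A, 2)        # initial penalty (a common heuristic)
    Y = np.zeros_like(A)
    S = np.zeros_like(A)
    for _ in range(iters):
        L = svd_shrink(A - S + Y / mu, 1.0 / mu)
        S = shrink(A - L + Y / mu, lam / mu)
        Z = A - L - S                        # constraint violation
        Y = Y + mu * Z                       # dual ascent
        mu = min(mu * 1.5, 1e7)              # gently increase the penalty
        if np.linalg.norm(Z) < tol * norm_A:
            break
    return L, S
```

On a low-rank matrix corrupted by a few large sparse errors, this iteration separates the two components to high accuracy in a few dozen SVDs; the packages listed above add partial SVDs and continuation to make this scale.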
Sparse PCA: A = DX with unknown D and X, solve for sparse
D
Sparse PCA on wikipedia
- R. Jenatton, G. Obozinski, F. Bach. Structured Sparse Principal Component Analysis. International Conference on Artificial Intelligence and Statistics (AISTATS). [pdf]
[code] - SPAMs
- DSPCA: Sparse
PCA using SDP. The code is here. - PathPCA: A fast greedy algorithm for Sparse PCA. The code is here.
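DSPCA solves an SDP relaxation and PathPCA is greedy; for intuition, here is a much simpler heuristic, a truncated power iteration that keeps only the k largest-magnitude loadings of the leading eigenvector at each step. The function name and warm start are my own choices, not from either package.

```python
import numpy as np

def sparse_pc(C, k, iters=100):
    """Truncated power iteration on a covariance C: keep only the k
    largest-magnitude loadings at each step."""
    x = np.zeros(C.shape[0])
    x[np.argmax(np.diag(C))] = 1.0           # warm start at the largest variance
    for _ in range(iters):
        x = C @ x
        x[np.argsort(np.abs(x))[:-k]] = 0.0  # truncate to the k largest entries
        x /= np.linalg.norm(x)
    return x
```

When the leading eigenvector of C is genuinely sparse and well separated from the noise, this recovers its support; the returned vector is defined only up to sign, as with any PCA loading.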
Dictionary Learning: A = DX with unknown D and X, solve for sparse
X
Some implementations of dictionary learning also implement NMF.
- Online
Learning for Matrix Factorization and Sparse Coding by Julien Mairal, Francis
Bach, Jean Ponce, Guillermo
Sapiro [the code is released as SPArse Modeling Software, or SPAMS] - Dictionary
Learning Algorithms for Sparse Representation (Matlab implementation of FOCUSS/FOCUSS-CNDL
is here) - Multiscale
sparse image representation with learned dictionaries [Matlab implementation of the K-SVD
algorithm is here, a newer implementation by Ron Rubinstein is here ] - Efficient
sparse coding algorithms [ Matlab code
is here ] url=http%3A%2F%2Fwww2.imm.dtu.dk%2Fpubdb%2Fviews%2Fedoc_download.php%2F4659%2Fpdf%2Fimm4659.pdf" style="color:rgb(41,112,166); text-decoration:none; margin:0px; padding:0px; word-wrap:break-word; border:none">Shift
. Matlab implemention is here
Invariant Sparse Coding of Image and Music Data- Shift-invariant
dictionary learning for sparse representations: extending K-SVD. - Thresholded
Smoothed-L0 (SL0) Dictionary Learning for Sparse Representations by Hadi Zayyani, Massoud
Babaie-Zadeh and Remi Gribonval. - Non-negative
Sparse Modeling of Textures (NMF) [Matlab implementation of NMF
(Non-negative Matrix Factorization) and NTF (Non-negative Tensor), a faster implementation of NMF can be found here,
here is a more recent Non-Negative Tensor Factorizations package]
NMF: A = DX with unknown D and X, solve for elements of D,X
> 0
Non-negative
Matrix Factorization (NMF) on wikipedia
- HALS: Accelerated
Multiplicative Updates and Hierarchical ALS Algorithms for Nonnegative Matrix Factorization by Nicolas
Gillis, François Glineur. - SPAMS (SPArse
Modeling Software) by Julien Mairal, Francis
Bach, Jean Ponce,Guillermo
Sapiro - NMF: C.-J. Lin. Projected
gradient methods for non-negative matrix factorization. Neural
Computation, 19(2007), 2756-2779.
- Non-Negative Matrix Factorization: This
page contains an optimized C implementation of the Non-Negative Matrix Factorization (NMF) algorithm, described in [Lee
& Seung 2001]. We implement the update rules that minimize a weighted SSD error metric. A detailed description of weighted NMF can be found in [Peers
et al. 2006]. - NTFLAB for
Signal Processing, Toolboxes for NMF (Non-negative Matrix Factorization) and NTF (Non-negative Tensor Factorization) for BSS (Blind Source Separation) - Non-negative
Sparse Modeling of Textures (NMF) [Matlab implementation of NMF
(Non-negative Matrix Factorization) and NTF (Non-negative Tensor), a faster implementation of NMF can be found here,
here is a more recent Non-Negative Tensor Factorizations package]
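The Lee & Seung multiplicative update rules referenced above are short enough to sketch in full. This is a minimal NumPy version for the plain (unweighted) Euclidean objective ||A − WH||²; the eps guard and the random initialization are my own choices.

```python
import numpy as np

def nmf_mu(A, r, iters=500, eps=1e-9, rng=None):
    """Lee & Seung multiplicative updates for A ~ W H with W, H >= 0,
    minimizing the (unweighted) Euclidean error."""
    rng = rng or np.random.default_rng(0)
    m, n = A.shape
    W = rng.random((m, r)) + eps
    H = rng.random((r, n)) + eps
    for _ in range(iters):
        H *= (W.T @ A) / (W.T @ W @ H + eps)   # update H; stays nonnegative
        W *= (A @ H.T) / (W @ H @ H.T + eps)   # update W; stays nonnegative
    return W, H
```

Because the updates only multiply by nonnegative ratios, W and H remain elementwise nonnegative throughout, which is the whole point of the scheme; the objective is nonincreasing, though the method can stop at a local minimum.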
Multiple Measurement Vector (MMV), Y = A X with unknown X, where the rows
of X are sparse.
- T-MSBL/T-SBL by Zhilin
Zhang - Compressive
MUSIC with optimized partial support for joint sparse recovery by Jong
Min Kim, Ok Kyun Lee, Jong
Chul Ye [no code] - The
REMBO Algorithm Accelerated Recovery of Jointly Sparse Vectors by Moshe Mishali and Yonina C. Eldar [no code]
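The row-sparse MMV model itself can be illustrated with a simple proximal-gradient (group-ISTA) sketch for the l2,1-regularized formulation. This illustrates the model only; it is not a reimplementation of T-MSBL, Compressive MUSIC or REMBO, and all names and parameters are mine.

```python
import numpy as np

def mmv_ista(A, Y, lam=0.05, iters=500):
    """Group-ISTA for the MMV model Y = A X with row-sparse X:
    minimize 0.5*||A X - Y||_F^2 + lam * sum_i ||row_i(X)||_2."""
    Lc = np.linalg.norm(A, 2) ** 2                          # Lipschitz constant
    X = np.zeros((A.shape[1], Y.shape[1]))
    for _ in range(iters):
        Z = X - A.T @ (A @ X - Y) / Lc                      # gradient step
        norms = np.linalg.norm(Z, axis=1, keepdims=True)
        scale = np.maximum(1.0 - (lam / Lc) / np.maximum(norms, 1e-12), 0.0)
        X = scale * Z                                       # row-wise soft-threshold
    return X
```

Sharing one support across all measurement vectors is what makes MMV easier than solving each column separately: the row norms concentrate on the true support.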
Blind Source Separation (BSS) Y = A X with unknown A and X and
statistical independence between columns of X or subspaces of columns of X
This includes Independent Component Analysis (ICA), Independent Subspace Analysis (ISA), and Sparse Component Analysis (SCA). There are many available codes for ICA and some for SCA. Here is a non-exhaustive list of some
famous ones (which are not limited to linear instantaneous mixtures). TBC
ICA:
- ICALab:
http://www.bsp.brain.riken.jp/ICALAB/
- BLISS softwares:
http://www.lis.inpg.fr/pages_perso/bliss/deliverables/d20.html
- MISEP: http://www.lx.it.pt/~lbalmeida/ica/mitoolbox.html
- Parra and Spence’s frequency-domain convolutive ICA: http://people.kyb.tuebingen.mpg.de/harmeling/code/convbss-0.1.tar
- C-FICA: http://www.ast.obs-mip.fr/c-fica
SCA:
- DUET:
http://sparse.ucd.ie/publications/rickard07duet.pdf
(the MATLAB code is given at the end of this pdf document) - LI-TIFROM: http://www.ast.obs-mip.fr/li-tifrom
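As a complement to the toolboxes above, here is a compact symmetric FastICA sketch for the linear instantaneous ICA model: center, whiten, then iterate the tanh fixed-point update with symmetric decorrelation. It is a bare-bones illustration, not a substitute for ICALab or the other packages; the function name and defaults are mine.

```python
import numpy as np

def fastica(X, iters=200, rng=None):
    """Symmetric FastICA sketch: center, whiten, then iterate the tanh
    fixed-point update with symmetric decorrelation."""
    rng = rng or np.random.default_rng(0)
    n, T = X.shape
    X = X - X.mean(axis=1, keepdims=True)
    d, E = np.linalg.eigh(X @ X.T / T)
    Xw = (E / np.sqrt(d)) @ E.T @ X               # whitened data: cov = I
    W = rng.standard_normal((n, n))
    for _ in range(iters):
        G = np.tanh(W @ Xw)
        W = G @ Xw.T / T - np.diag((1.0 - G**2).mean(axis=1)) @ W
        U, _, Vt = np.linalg.svd(W)
        W = U @ Vt                                # symmetric decorrelation
    return W @ Xw                                 # estimated sources
```

As always with ICA, the sources come back up to permutation, sign and scale, so quality is judged by correlation against the true sources rather than entrywise error.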
Randomized Algorithms
These algorithms generally use random projections to shrink very large problems into smaller ones that are amenable to traditional matrix factorization methods.
Resource
Randomized algorithms for matrices and data by Michael W. Mahoney
Randomized Algorithms for Low-Rank Matrix
Decomposition
- Randomized PCA
- Randomized Least Squares: Blendenpik (http://pdos.csail.mit.edu/~petar/papers/blendenpik-v1.pdf)
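The basic recipe behind randomized low-rank decomposition is the range finder of Halko, Martinsson and Tropp: multiply A by a random test matrix, orthonormalize the result, and factor the much smaller projected matrix. A minimal NumPy sketch, with parameter defaults of my own choosing:

```python
import numpy as np

def randomized_svd(A, k, oversample=10, power=2, rng=None):
    """Randomized SVD via the Gaussian range finder, with a few power
    iterations to sharpen the captured subspace."""
    rng = rng or np.random.default_rng(0)
    Omega = rng.standard_normal((A.shape[1], k + oversample))
    Y = A @ Omega                         # sample the range of A
    for _ in range(power):
        Y = A @ (A.T @ Y)                 # power iteration
    Q, _ = np.linalg.qr(Y)
    B = Q.T @ A                           # small (k + oversample) x n matrix
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    return (Q @ Ub)[:, :k], s[:k], Vt[:k]
```

The expensive full SVD is replaced by an SVD of the small matrix B, which is why these methods scale to matrices where a dense SVD is out of the question.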
Other factorizations
D(T(.)) = L + E with unknown L, E and unknown transformation T; solve
for the transformation T, the low-rank L and the noise E
- RASL:
Robust Batch Alignment of Images by Sparse and Low-Rank Decomposition - TILT:
Transform Invariant Low-rank Textures
Frameworks featuring advanced Matrix factorizations
For the time being, few have integrated the most recent factorizations.
- Scikit
Learn (Python) - Matlab
Toolbox for Dimensionality Reduction (Probabilistic PCA, Factor Analysis (FA)…) - Orange (Python)
- pcaMethods: a Bioconductor package
providing PCA methods for incomplete data (R language)
GraphLab / Hadoop
- Danny Bickson keeps
a blog on GraphLab.
Books
Example of use
- CS:
Low Rank Compressive Spectral Imaging and a multishot CASSI - CS:
Heuristics for Rank Proxy and how it changes everything… - Tennis
Players are Sparse!
Sources
Arvind Ganesh’s Low-Rank
Matrix Recovery and Completion via Convex Optimization
- Raghunandan H. Keshavan's list - Stephen
Becker’s list - Nuclear
Norm and Matrix Recovery through SDP by Christoph Helmberg - Nuit Blanche
Relevant links
Reference:
A
Unified View of Matrix Factorization Models by Ajit P. Singh and Geoffrey J. Gordon