Some diligent folks in the US have collected nearly all of the matrix factorization algorithms and applications out there. Since the original page sits on the other side of the Great Firewall, it is reposted here. Original address:

Matrix decompositions have a long history and generally center around a set of known factorizations such as LU, QR, SVD and eigendecompositions. More recent factorizations have seen the light of day with work that started with the advent of NMF, k-means and related algorithms [1]. However, with the advent of new methods based on random projections and convex optimization that started in part in the compressive sensing literature, we are seeing another surge of very diverse algorithms dedicated to many different kinds of matrix factorizations with new constraints based on rank and/or positivity and/or sparsity. As a result of this large increase in interest, I have decided to keep a list of them here, following the success of the big picture in compressive sensing.

The sources for this list include the following most excellent sites: Stephen Becker’s page, Raghunandan H. Keshavan’s page, Nuclear Norm and Matrix Recovery through SDP by Christoph Helmberg, and Arvind Ganesh’s Low-Rank Matrix Recovery and Completion via Convex Optimization, which provide more in-depth additional information. Additional codes were also featured on Nuit Blanche. The following people provided additional inputs: Olivier Grisel and Matthieu Puigt.

Most of the algorithms listed below generally rely on using the nuclear norm as a proxy for the rank functional, which may not be optimal. Currently, CVX (by Michael Grant and Stephen Boyd) consistently allows one to explore other proxies for the rank functional, such as the log-det heuristic found by Maryam Fazel, Haitham Hindi and Stephen Boyd. ** is used to show that an algorithm uses a heuristic other than the nuclear norm.
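
As a quick illustration of why the nuclear norm is used as a convex surrogate for rank (a numpy sketch of my own, not one of the codes listed here): the rank counts the nonzero singular values, while the nuclear norm sums them, which is what makes it tractable for convex solvers.

```python
import numpy as np

# Build a random 50x40 matrix of rank 5.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 5)) @ rng.standard_normal((5, 40))

s = np.linalg.svd(A, compute_uv=False)   # singular values, descending
rank = np.sum(s > 1e-10)                 # rank = number of nonzero singular values
nuclear_norm = s.sum()                   # nuclear norm = their sum (convex in A)

print(rank)           # 5
print(nuclear_norm)   # the convex surrogate used in place of rank
```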

In terms of notation, A refers to a matrix, L to a low-rank matrix, S to a sparse one and N to a noisy one. This page lists the different codes that implement the following matrix factorizations: Matrix Completion, Robust PCA, Noisy Robust PCA, Sparse PCA, NMF, Dictionary Learning, MMV, Randomized Algorithms and other factorizations. Some of these toolboxes implement several of these decompositions and are listed accordingly. Before I list an algorithm here, I generally feature it on Nuit Blanche under the MF tag: http://nuit-blanche.blogspot.com/search/label/MF. You can also subscribe to the Nuit Blanche feed.

Matrix Completion: A = H.*L with H a known mask and L unknown; solve for the L of lowest possible rank.

The idea of this approach is to complete the unknown coefficients of a matrix based on the fact that the matrix is low rank.
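
A minimal numpy sketch of this idea (my own illustration, not one of the referenced packages): iterative singular-value soft-thresholding in the SoftImpute style, where H is the 0/1 observation mask; the shrinkage parameter tau and the iteration count are assumptions of the sketch.

```python
import numpy as np

def complete_matrix(A, H, tau=1.0, n_iter=200):
    """Fill in unobserved entries of A (H == 1 marks observed entries) by
    iterative singular-value soft-thresholding (a SoftImpute-style sketch)."""
    L = np.where(H == 1, A, 0.0)               # start from the observed entries
    for _ in range(n_iter):
        U, s, Vt = np.linalg.svd(L, full_matrices=False)
        s = np.maximum(s - tau, 0.0)           # shrink singular values -> low rank
        L = (U * s) @ Vt
        L[H == 1] = A[H == 1]                  # re-impose the known entries
    return L

# Tiny demo: a rank-2 matrix with roughly 60% of its entries observed.
rng = np.random.default_rng(0)
truth = rng.standard_normal((30, 2)) @ rng.standard_normal((2, 20))
H = (rng.random(truth.shape) < 0.6).astype(int)
A = truth * H                                  # unobserved entries are simply zeroed
L_hat = complete_matrix(A, H)
err = np.linalg.norm((L_hat - truth)[H == 0]) / np.linalg.norm(truth[H == 0])
print("relative error on missing entries:", err)
```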

Noisy Robust PCA: A = L + S + N with L, S, N unknown; solve for L low rank, S sparse, N noise.

Robust PCA: A = L + S with L and S unknown; solve for L low rank, S sparse.
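
A minimal numpy sketch of this decomposition (my own, not one of the codes listed in this section), following the principal component pursuit idea with a simple augmented-Lagrangian loop: singular-value thresholding updates L, entrywise soft-thresholding updates S. Replacing the exact constraint with a tolerance on ||A - L - S|| gives a crude handle on the noisy variant above. The choices of lam and mu below are common defaults, not prescriptions.

```python
import numpy as np

def soft(X, t):
    """Entrywise soft-thresholding."""
    return np.sign(X) * np.maximum(np.abs(X) - t, 0.0)

def robust_pca(A, lam=None, mu=None, n_iter=200, tol=1e-7):
    """Split A into L (low rank) + S (sparse) via an inexact
    augmented-Lagrangian loop (principal component pursuit sketch)."""
    m, n = A.shape
    lam = lam if lam is not None else 1.0 / np.sqrt(max(m, n))
    mu = mu if mu is not None else 0.25 * m * n / np.abs(A).sum()
    Y = np.zeros_like(A)                        # dual variable
    S = np.zeros_like(A)
    for _ in range(n_iter):
        # L-update: singular-value thresholding
        U, s, Vt = np.linalg.svd(A - S + Y / mu, full_matrices=False)
        L = (U * np.maximum(s - 1.0 / mu, 0.0)) @ Vt
        # S-update: entrywise soft-thresholding
        S = soft(A - L + Y / mu, lam / mu)
        # dual update on the residual
        R = A - L - S
        Y = Y + mu * R
        if np.linalg.norm(R) / max(np.linalg.norm(A), 1e-12) < tol:
            break
    return L, S

# Demo: a low-rank matrix corrupted by sparse, large-magnitude errors.
rng = np.random.default_rng(0)
L0 = rng.standard_normal((60, 2)) @ rng.standard_normal((2, 40))
S0 = (rng.random(L0.shape) < 0.05) * rng.standard_normal(L0.shape) * 10
L_hat, S_hat = robust_pca(L0 + S0)
print("rank of L_hat:", np.linalg.matrix_rank(L_hat, tol=1e-3))
```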

Sparse PCA: A = DX with unknown D and X; solve for sparse D.

Sparse PCA on wikipedia

  • R. Jenatton, G. Obozinski, F. Bach. Structured Sparse Principal Component Analysis. International Conference on Artificial Intelligence and Statistics (AISTATS). [pdf] [code]
  • SPAMS
  • DSPCA: Sparse PCA using SDP. Code is here.
  • PathSPCA: A fast greedy algorithm for Sparse PCA. The code is here.
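
As a usage-level illustration of the sparse-PCA model (a sketch assuming scikit-learn is available; it is not one of the packages above, and since it stores samples as rows the sparse factor appears as components_, i.e. the transpose of D):

```python
import numpy as np
from sklearn.decomposition import SparsePCA

# Synthetic data: 200 samples in 30 dimensions driven by a few latent directions.
rng = np.random.default_rng(0)
codes = rng.standard_normal((200, 5))
directions = rng.standard_normal((5, 30))
data = codes @ directions + 0.01 * rng.standard_normal((200, 30))

spca = SparsePCA(n_components=5, alpha=1.0, random_state=0)
scores = spca.fit_transform(data)     # the dense factor (X, up to transposition)
D_sparse = spca.components_           # sparse loadings: many exact zeros
print("fraction of zero loadings:", np.mean(D_sparse == 0))
```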

Dictionary Learning: A = DX with unknown D and X; solve for sparse X.

Some implementations of dictionary learning also implement NMF.
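
A minimal sketch of the dictionary-learning model using scikit-learn (an assumption of this illustration, not one of the codes listed here; with samples as rows, the sparse factor is the code matrix and components_ holds the learned atoms):

```python
import numpy as np
from sklearn.decomposition import DictionaryLearning

# Synthetic signals: each row is a sparse combination of 8 unknown atoms.
rng = np.random.default_rng(0)
atoms = rng.standard_normal((8, 25))
codes = rng.standard_normal((300, 8)) * (rng.random((300, 8)) < 0.2)
signals = codes @ atoms + 0.01 * rng.standard_normal((300, 25))

dl = DictionaryLearning(n_components=8, transform_algorithm="omp",
                        transform_n_nonzero_coefs=3, random_state=0)
sparse_codes = dl.fit_transform(signals)   # the sparse factor (X, up to transposition)
dictionary = dl.components_                # the learned atoms (D, up to transposition)
print("avg nonzeros per code:", np.mean((sparse_codes != 0).sum(axis=1)))
```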

NMF: A = DX with unknown D and X; solve for the elements of D, X > 0.

Non-negative Matrix Factorization (NMF) on wikipedia
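
A usage-level sketch of the NMF model, assuming scikit-learn (an illustration only, not one of the listed codes):

```python
import numpy as np
from sklearn.decomposition import NMF

# A nonnegative data matrix, e.g. term-document style counts.
rng = np.random.default_rng(0)
A = rng.poisson(lam=3.0, size=(100, 40)).astype(float)

model = NMF(n_components=6, init="nndsvda", max_iter=500, random_state=0)
W = model.fit_transform(A)        # nonnegative left factor  (D in A ~ DX)
H = model.components_             # nonnegative right factor (X in A ~ DX)
print("reconstruction error:", np.linalg.norm(A - W @ H))
```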

Multiple Measurement Vector (MMV): Y = AX with unknown X, where the rows of X are sparse.
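
As one way to illustrate the MMV model (a row-sparsity pattern shared across all measurement vectors), a sketch using scikit-learn's MultiTaskLasso, whose group penalty zeroes out entire rows of the coefficient matrix; this is an assumed stand-in solver, not a package from the original list.

```python
import numpy as np
from sklearn.linear_model import MultiTaskLasso

# Y = A X with the same set of nonzero rows of X in every column of Y.
rng = np.random.default_rng(0)
A = rng.standard_normal((80, 40))             # known dictionary / sensing matrix
X_true = np.zeros((40, 6))
X_true[rng.choice(40, size=5, replace=False), :] = rng.standard_normal((5, 6))
Y = A @ X_true + 0.01 * rng.standard_normal((80, 6))

mmv = MultiTaskLasso(alpha=0.05, max_iter=5000)
mmv.fit(A, Y)                                  # samples = rows of A, tasks = columns of Y
X_hat = mmv.coef_.T                            # back to the (40, 6) layout of X
print("recovered nonzero rows:",
      np.flatnonzero(np.linalg.norm(X_hat, axis=1) > 1e-6))
```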

Blind Source Separation (BSS): Y = AX with unknown A and X, and statistical independence between the columns of X or between subspaces of the columns of X.

This includes Independent Component Analysis (ICA), Independent Subspace Analysis (ISA), and Sparse Component Analysis (SCA). There are many available codes for ICA and some for SCA. Here is a non-exhaustive list of some famous ones (which are not limited to linear instantaneous mixtures). TBC

ICA:
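
As an illustration of the ICA model (a sketch using scikit-learn's FastICA, not one of the codes originally listed under this heading):

```python
import numpy as np
from sklearn.decomposition import FastICA

# Two independent, non-Gaussian sources mixed by an unknown 2x2 matrix.
rng = np.random.default_rng(0)
t = np.linspace(0, 8, 2000)
S = np.c_[np.sin(3 * t), np.sign(np.sin(5 * t))]   # independent sources
A_mix = np.array([[1.0, 0.5], [0.3, 1.0]])
X_obs = S @ A_mix.T                                # observed mixtures

ica = FastICA(n_components=2, random_state=0)
S_hat = ica.fit_transform(X_obs)    # recovered sources (up to scale/permutation)
A_hat = ica.mixing_                 # estimated mixing matrix
print(A_hat.shape)
```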

SCA:

Randomized Algorithms

These algorithms generally use random projections to shrink very large problems into smaller ones that are amenable to traditional matrix factorization methods.
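
A minimal numpy sketch of the random-projection idea in the style of Halko, Martinsson and Tropp's randomized SVD; the rank and oversampling parameters below are assumptions of this illustration.

```python
import numpy as np

def randomized_svd(A, rank, oversample=10):
    """Approximate truncated SVD via a random range-finder step."""
    m, n = A.shape
    rng = np.random.default_rng(0)
    Omega = rng.standard_normal((n, rank + oversample))   # random test matrix
    Q, _ = np.linalg.qr(A @ Omega)                        # orthonormal basis for range(A @ Omega)
    B = Q.T @ A                                           # small (rank+oversample) x n problem
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    U = Q @ Ub
    return U[:, :rank], s[:rank], Vt[:rank, :]

# Demo on a large-ish low-rank matrix.
rng = np.random.default_rng(1)
A = rng.standard_normal((2000, 15)) @ rng.standard_normal((15, 500))
U, s, Vt = randomized_svd(A, rank=15)
print("relative error:", np.linalg.norm(A - (U * s) @ Vt) / np.linalg.norm(A))
```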

Resource

Randomized algorithms for matrices and data by Michael W. Mahoney

Randomized Algorithms for Low-Rank Matrix Decomposition

Other factorization

D(T(.)) = L + E with L, E and the transformation T unknown; solve for the transformation T, the low-rank L and the noise E.

Frameworks featuring advanced Matrix factorizations

For the time being, few have integrated the most recent factorizations.

GraphLab / Hadoop

Books

Example of use

Sources

Arvind Ganesh’s Low-Rank Matrix Recovery and Completion via Convex Optimization

Relevant links

Reference:

[1] A Unified View of Matrix Factorization Models by Ajit P. Singh and Geoffrey J. Gordon

Permalink of this post: http://blog.sciencenet.cn/blog-242887-483128.html
