Matrix factorization

Intro: Following up on the previous installment's introduction to matrix algebra, this time we move on to the more advanced topic of matrix factorization. The other installments can be found under the [线性代数] (linear algebra) tag. A table of contents will come later when I have time.

Matrix factorization can be viewed as an application of invertible matrices: in the LU factorization discussed below, the factor L is an invertible matrix.

As you may have seen, solving Ax=b for x can be tedious if we rerun the whole row-reduction algorithm every time. Here we explore a more efficient approach for finding x in a matrix equation: LU factorization. Suppose we are given L and U that reconstruct A as A=LU, where L is an invertible unit lower triangular m×m matrix and U is an m×n echelon form of A. Recall that one way to solve for x is x=A^{-1}b, which requires A to be invertible (and a product of invertible matrices is itself invertible, as proved in a previous article in this series). The motivation here is that if we need x for several different right-hand sides, we would have to compute A^{-1}b_i for every single b_i. That's not desirable and we should look for ways to circumvent this…
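To see the payoff concretely, here is a minimal sketch using SciPy's lu_factor/lu_solve (the matrix A and the right-hand sides below are made-up examples): the factorization is computed once and reused for every b, and A^{-1} is never formed.

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

# A made-up invertible 3x3 coefficient matrix.
A = np.array([[2., 1., 1.],
              [4., 3., 3.],
              [8., 7., 9.]])

# Factor once; lu_factor stores L and U compactly together with pivot information.
lu, piv = lu_factor(A)

# Reuse the same factorization for as many right-hand sides as we like.
for b in (np.array([1., 2., 3.]), np.array([0., 5., -1.])):
    x = lu_solve((lu, piv), b)          # two triangular solves, no explicit inverse
    print(x, np.allclose(A @ x, b))
```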

Suppose L and U are already given; writing A=LU is just the first step of LU factorization. Remember that our goal in using matrix factorization is to solve for x in the matrix equation. So we rely on the following: since Ax = L(Ux) = b, we can set y = Ux and split the problem into the pair Ly = b and Ux = y.

The above suggests that by row-reducing [L  b] and then [U  y], we can get x; y is the intermediate result on the way from b to x. Note that we still have to handle each b individually in Ax=b, it is just that with the assistance of L and U far fewer steps are involved.
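As a sketch of this two-step solve (assuming L and U are already given as NumPy arrays, with L unit lower triangular and, for simplicity, U square upper triangular with nonzero diagonal), forward substitution recovers y from Ly=b and back substitution then recovers x from Ux=y:

```python
import numpy as np

def solve_with_lu(L, U, b):
    """Solve LUx = b by forward substitution (Ly = b) then back substitution (Ux = y).

    Assumes L is unit lower triangular and U is square upper triangular
    with nonzero diagonal entries (i.e. A = LU is invertible)."""
    n = len(b)
    # Forward substitution: y[i] depends only on y[0..i-1].
    y = np.zeros(n)
    for i in range(n):
        y[i] = b[i] - L[i, :i] @ y[:i]          # unit diagonal, so no division needed
    # Back substitution: x[i] depends only on x[i+1..n-1].
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (y[i] - U[i, i + 1:] @ x[i + 1:]) / U[i, i]
    return x

# Tiny made-up example with A = L @ U.
L = np.array([[1., 0., 0.],
              [2., 1., 0.],
              [4., 3., 1.]])
U = np.array([[2., 1., 1.],
              [0., 1., 1.],
              [0., 0., 2.]])
b = np.array([1., 2., 3.])
x = solve_with_lu(L, U, b)
print(np.allclose(L @ U @ x, b))   # True
```

Each solve costs on the order of n² operations, versus the roughly n³ cost of a full row reduction, which is why reusing L and U across many right-hand sides pays off.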

Since L is a unit lower triangular matrix, its columns must be linearly independent, and since it is m×m, L is also invertible. This means the following: L is row equivalent to the m×m identity matrix, so row-reducing [L  b] to [I  y] is straightforward.

Indeed, given a unit lower triangular matrix L, it is trivial to reduce it to the m×m identity. U, on the other hand, is the m×n echelon form of A, so an identity matrix is not guaranteed there: U need not be square, and its reduced echelon form may not be the identity.

The LU factorization algorithm

The prerequisite for using this algorithm is that the matrix A in Ax=b must be reducible to an echelon form U using row replacements alone, applied in a top-down manner. This requirement is not always easy to meet, and people sometimes relax it by allowing row interchanges before performing the top-down sequential row replacements on A. If the requirement is satisfied, we are guaranteed to obtain the unit lower triangular matrix L, and the proof is shown below: if E_1, …, E_p are the row-replacement elementary matrices that reduce A to U, then E_p⋯E_1 A = U, and hence A = (E_p⋯E_1)^{-1} U = LU with L = E_1^{-1}⋯E_p^{-1}, which is again unit lower triangular.

And if we apply the same sequence of elementary matrices to L, we restore the identity matrix I, as follows: (E_p⋯E_1) L = (E_p⋯E_1)(E_1^{-1}⋯E_p^{-1}) = I. In other words, the very row operations that reduce A to U also reduce L to I.

This may still sound a bit abstract. What exactly does it give us, and how is it used to find L? The following example shows how. During the row reduction of A into U, the entries below the pivot position in each pivot column are zeroed out. Reversing those elementary row operations only requires us to gather each pivot column as it looks just before its transformation, divide it by its pivot entry so the diagonal becomes 1, and pack the resulting columns into an m×m matrix.

Once all the pivot columns, taken before their row replacements and scaled by their pivots, have been gathered, L is immediately available.
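The sketch below carries this out for a square A (a simplification; the example matrix is made up, and it is assumed no row interchanges are needed, i.e. every pivot encountered is nonzero). The scaled pivot columns are dropped into L while A is reduced in place to U:

```python
import numpy as np

def lu_no_pivot(A):
    """LU factorization by top-down row replacement only (no row interchanges).

    Returns (L, U) with L unit lower triangular and U upper triangular,
    assuming every pivot encountered is nonzero."""
    U = A.astype(float).copy()
    m = U.shape[0]
    L = np.eye(m)
    for k in range(m - 1):
        pivot = U[k, k]
        # The pivot column below the pivot, scaled by the pivot, becomes column k of L.
        L[k + 1:, k] = U[k + 1:, k] / pivot
        # Row replacement: zero out the entries below the pivot.
        U[k + 1:, :] -= np.outer(L[k + 1:, k], U[k, :])
    return L, U

A = np.array([[2., 1., 1.],
              [4., 3., 3.],
              [8., 7., 9.]])
L, U = lu_no_pivot(A)
print(np.allclose(L @ U, A))   # True
```

For the relaxed variant that allows row interchanges, scipy.linalg.lu returns a permutation matrix P alongside L and U so that A = PLU.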

Examples

Examples will be added from time to time.
