7.1 Singular values and Singular vectors

The SVD separates any matrix into simple pieces.

A is any m by n matrix, square or rectangular. Its rank is r.

Key equations of the SVD

\[AA^Tu_i = \sigma_i^{2}u_i \\
A^TAv_i = \sigma_i^{2}v_i \\
Av_i = \sigma_i u_i
\]

\(u_i\)— the left singular vectors (unit eigenvectors of \(AA^T\))

\(v_i\)— the right singular vectors (unit eigenvectors of \(A^TA\))

\(\sigma_i\)— singular values (square roots of the equal eigenvalues of \(AA^T\) and \(A^TA\))

The rank of A equals the number of nonzero singular values \(\sigma_i\).

Example:

\[A = \left [ \begin{matrix} 1&0 \\ 1&1 \end{matrix}\right] \\
\Downarrow \\
AA^T =
\left [ \begin{matrix} 1&0 \\ 1&1 \end{matrix}\right]
\left [ \begin{matrix} 1&1 \\ 0&1 \end{matrix}\right]
=\left [ \begin{matrix} 1&1 \\ 1&2 \end{matrix}\right]
\\
A^TA =
\left [ \begin{matrix} 1&1 \\ 0&1 \end{matrix}\right]
\left [ \begin{matrix} 1&0 \\ 1&1 \end{matrix}\right]
=\left [ \begin{matrix} 2&1 \\ 1&1 \end{matrix}\right]
\\
\Downarrow \\
det(AA^T - \lambda I) = 0 \ \quad \ det(A^TA - \lambda I) = 0 \\
\lambda_1 = \frac{3+\sqrt{5}}{2} , \sigma_1=\frac{1+\sqrt{5}}{2},
u_1= \frac{1}{\sqrt{1+\sigma_1^2}}\left [ \begin{matrix} 1 \\ \sigma_1 \end{matrix}\right],
v_1= \frac{1}{\sqrt{1+\sigma_1^2}}\left [ \begin{matrix} \sigma_1 \\ 1 \end{matrix}\right]
\\
\lambda_2 = \frac{3-\sqrt{5}}{2} , \sigma_2=\frac{\sqrt{5}-1}{2},
u_2= \frac{1}{\sqrt{1+\sigma_2^2}}\left [ \begin{matrix} 1 \\ -\sigma_2 \end{matrix}\right],
v_2= \frac{1}{\sqrt{1+\sigma_2^2}}\left [ \begin{matrix} \sigma_2 \\ -1 \end{matrix}\right]\\
\Downarrow \\
A =
\left [ \begin{matrix} u_1&u_2 \end{matrix}\right]
\left [ \begin{matrix} \sigma_1&\\&\sigma_2 \end{matrix}\right]
\left [ \begin{matrix} v_1^T\\v_2^T \end{matrix}\right]
\\
A\left [ \begin{matrix} v_1&v_2 \end{matrix}\right] =
\left [ \begin{matrix} u_1&u_2 \end{matrix}\right]
\left [ \begin{matrix} \sigma_1&\\&\sigma_2 \end{matrix}\right]
\]
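A quick numerical check of this example (a minimal numpy sketch; note that \(\sigma_1\) comes out as the golden ratio):

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0]])

# numpy returns U, the singular values, and V^T
U, sigma, Vt = np.linalg.svd(A)
print(sigma)                    # [1.618..., 0.618...]
print((1 + np.sqrt(5)) / 2)     # 1.618... = sigma_1

# check A v_i = sigma_i u_i and the factorization A = U Sigma V^T
print(np.allclose(A @ Vt.T, U * sigma))           # True
print(np.allclose(U @ np.diag(sigma) @ Vt, A))    # True
```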

7.2 Bases and Matrices in the SVD

Key points:

  1. The SVD produces orthonormal bases of \(u\)'s and \(v\)'s for the four fundamental subspaces.

    • \(u_1,u_2,...,u_r\) is an orthonormal basis of the column space (in \(R^m\)).
    • \(u_{r+1},...,u_{m}\) is an orthonormal basis of the left nullspace (in \(R^m\)).
    • \(v_1,v_2,...,v_r\) is an orthonormal basis of the row space (in \(R^n\)).
    • \(v_{r+1},...,v_{n}\) is an orthonormal basis of the nullspace (in \(R^n\)).
  2. Using those bases, A can be diagonalized:

    Reduced SVD: uses only the bases for the row space and column space.

    \[A = U_r \Sigma_r V_r^T \\
    U_r = \left [ \begin{matrix} u_1&\cdots&u_r \end{matrix}\right] ,
    \Sigma_r = \left [ \begin{matrix} \sigma_1&&\\&\ddots&\\&&\sigma_r \end{matrix}\right],
    V_r^T=\left [ \begin{matrix} v_1^T\\ \vdots \\ v_r^T \end{matrix}\right] \\
    \Downarrow \\
    A = \left [ \begin{matrix} u_1&\cdots&u_r \end{matrix}\right]
    \left [ \begin{matrix} \sigma_1&&\\&\ddots&\\&&\sigma_r \end{matrix}\right]
    \left [ \begin{matrix} v_1^T\\ \vdots \\ v_r^T \end{matrix}\right] \\
    = u_1\sigma_1v_{1}^T + u_2\sigma_2v_{2}^T + \cdots + u_r\sigma_rv_r^T
    \]

    Full SVD: includes bases for all four subspaces.

    \[A = U \Sigma V^T \\
    U = \left [ \begin{matrix} u_1&\cdots&u_r&\cdots&u_m \end{matrix}\right] ,
    \Sigma = \left [ \begin{matrix} \sigma_1&&&\\&\ddots&&\\&&\sigma_r&\\&&&0 \end{matrix}\right] (m \times n),
    V^T=\left [ \begin{matrix} v_1^T\\ \vdots \\ v_r^T \\ \vdots \\ v_n^T \end{matrix}\right] \\
    \Downarrow \\
    A = \left [ \begin{matrix} u_1&\cdots&u_r&\cdots&u_m \end{matrix}\right]
    \left [ \begin{matrix} \sigma_1&&&\\&\ddots&&\\&&\sigma_r&\\&&&0 \end{matrix}\right]
    \left [ \begin{matrix} v_1^T\\ \vdots \\ v_r^T \\ \vdots \\ v_n^T \end{matrix}\right] \\
    = u_1\sigma_1v_{1}^T + u_2\sigma_2v_{2}^T + \cdots + u_r\sigma_rv_r^T \quad (\sigma_{r+1}=\cdots=0, \text{ so the later terms vanish})
    \]

    Example: \(A=\left [ \begin{matrix} 3&0 \\ 4&5 \end{matrix}\right]\), \(r=2\)

    \[A^TA =\left [ \begin{matrix} 25&20 \\ 20&25 \end{matrix}\right],
    AA^T =\left [ \begin{matrix} 9&12 \\ 12&41 \end{matrix}\right] \\
    \lambda_1 = 45, \sigma_1 = \sqrt{45},
    v_1 = \frac{1}{\sqrt{2}}
    \left [ \begin{matrix} 1 \\ 1 \end{matrix}\right],
    u_1 = \frac{1}{\sqrt{10}}
    \left [ \begin{matrix} 1 \\ 3 \end{matrix}\right]\\
    \lambda_2 = 5, \sigma_2 = \sqrt{5} ,
    v_2 = \frac{1}{\sqrt{2}}
    \left [ \begin{matrix} -1 \\ 1 \end{matrix}\right],
    u_2 = \frac{1}{\sqrt{10}}
    \left [ \begin{matrix} -3 \\ 1 \end{matrix}\right]\\
    \Downarrow \\
    U = \frac{1}{\sqrt{10}}
    \left [ \begin{matrix} 1&-3 \\ 3&1 \end{matrix}\right],
    \Sigma = \left [ \begin{matrix} \sqrt{45}& \\ &\sqrt{5} \end{matrix}\right],
    V = \frac{1}{\sqrt{2}}
    \left [ \begin{matrix} 1&-1 \\ 1&1 \end{matrix}\right]
    \]
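    The same numbers can be checked numerically. A minimal numpy sketch (with `full_matrices=False`, numpy returns exactly the reduced factors \(U_r, \Sigma_r, V_r^T\)):

    ```python
    import numpy as np

    A = np.array([[3.0, 0.0],
                  [4.0, 5.0]])

    # reduced SVD (here r = m = n = 2, so reduced and full coincide)
    U, sigma, Vt = np.linalg.svd(A, full_matrices=False)
    print(sigma)                       # [6.708..., 2.236...] = [sqrt(45), sqrt(5)]

    # rebuild A as the sum of rank-one pieces  u_i * sigma_i * v_i^T
    A_rebuilt = sum(sigma[i] * np.outer(U[:, i], Vt[i, :]) for i in range(len(sigma)))
    print(np.allclose(A_rebuilt, A))   # True
    ```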

7.3 The geometry of the SVD

  1. \(A = U\Sigma V^T\) factors into (rotation)(stretching)(rotation). The geometry shows how A transforms vectors x on the unit circle to vectors Ax on an ellipse.

  2. Polar decomposition factors A into QS : rotation \(Q=UV^T\) times stretching \(S=V \Sigma V^T\).

    \[V^TV = I \\
    A = U\Sigma V^T = (UV^T)(V\Sigma V^T) = (Q)(S)
    \]

    Q is orthogonal and includes both rotations U and \(V^T\); S is symmetric positive semidefinite and gives the stretching directions.

    If A is invertible, S is positive definite.
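    A minimal numpy sketch of the polar factors, reusing the matrix from the example in 7.2:

    ```python
    import numpy as np

    A = np.array([[3.0, 0.0],
                  [4.0, 5.0]])
    U, sigma, Vt = np.linalg.svd(A)

    Q = U @ Vt                        # rotation  Q = U V^T
    S = Vt.T @ np.diag(sigma) @ Vt    # stretching  S = V Sigma V^T

    print(np.allclose(Q @ S, A))              # True: A = QS
    print(np.allclose(Q.T @ Q, np.eye(2)))    # True: Q is orthogonal
    print(np.allclose(S, S.T))                # True: S is symmetric
    ```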

  3. The pseudoinverse \(A^{+}\): \(A^{+}u_i = v_i/\sigma_i\) for \(i \le r\), so \(A^{+}A\) is the projection onto the row space (it equals I exactly when r = n).

    • \(Av_i=\sigma_iu_i\) : A multiplies \(v_i\) in the row space of A to give \(\sigma_i u_i\) in the column space of A.

    • If \(A^{-1}\) exists, \(A^{-1}u_i=\frac{v_i}{\sigma_i}\) : \(A^{-1}\) multiplies \(u_i\) in the row space of \(A^{-1}\) to give \(v_i/\sigma_i\) in the column space of \(A^{-1}\); the \(1/\sigma_i\) are the singular values of \(A^{-1}\).

    • Pseudoinverse of A: if \(A^{-1}\) exists, then \(A^{+}\) is the same as \(A^{-1}\). In general \(\Sigma^{+}\) is the n by m matrix that inverts only the nonzero singular values:

      \[A^{+} = V \Sigma^{+}U^{T} = \left [ \begin{matrix} v_1&\cdots&v_r&\cdots&v_n \end{matrix}\right]
      \left [ \begin{matrix} \sigma_1^{-1}&&&\\&\ddots&&\\&&\sigma_r^{-1}&\\&&&0 \end{matrix}\right]
      \left [ \begin{matrix} u_1^T\\ \vdots \\ u_r^T \\ \vdots \\ u_m^T \end{matrix}\right] \\
      \]
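      A numpy sketch with a small rank-deficient matrix (an assumed example, not from the text), comparing \(V\Sigma^{+}U^T\) against numpy's built-in pseudoinverse:

      ```python
      import numpy as np

      # rank r = 1 (the second row is twice the first), m = 2, n = 3
      A = np.array([[1.0, 2.0, 2.0],
                    [2.0, 4.0, 4.0]])

      U, sigma, Vt = np.linalg.svd(A)
      r = int(np.sum(sigma > 1e-12))            # numerical rank

      # Sigma^+ is n x m: invert the r nonzero sigmas, keep the zeros
      Sigma_plus = np.zeros((A.shape[1], A.shape[0]))
      Sigma_plus[:r, :r] = np.diag(1.0 / sigma[:r])

      A_plus = Vt.T @ Sigma_plus @ U.T
      print(np.allclose(A_plus, np.linalg.pinv(A)))   # True

      # A^+ A is not I here: it projects onto the row space of A
      print(np.round(A_plus @ A, 3))
      ```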

7.4 Principal Component Analysis (PCA by the SVD)

PCA gives a way to understand a data plot in dimension m. Typical applications include human genetics, face recognition, finance, and model order reduction (computation).

The sample covariance matrix is \(S=AA^T/(n-1)\), where the m by n matrix A holds n centered data points as its columns.

The crucial connection to linear algebra is in the singular values and singular vectors of A, which come from the eigenvalues \(\lambda=\sigma^2\) and the eigenvectors u of the sample covariance matrix \(S=AA^T/(n-1)\).

  1. The total variance in the data is the sum of all eigenvalues, and also the sum of the sample variances \(s_i^2\):

    \[T = \sigma_1^2 + \cdots + \sigma_m^2 = s_1^2 + \cdots + s_m^2 = \operatorname{trace}(S) \quad (\text{the diagonal sum})
    \]
  2. The first eigenvector \(u_1\) of S points in the most significant direction of the data. That direction accounts for a fraction \(\sigma_1^2/T\) of the total variance.

  3. The next eigenvector \(u_2\) (orthogonal to \(u_1\)) accounts for a smaller fraction \(\sigma_2^2/T\).

  4. Stop when those fractions are small. You then have the R directions that explain most of the data. The n data points are very near an R-dimensional subspace with basis \(u_1, \cdots, u_R\); these are the principal components (see the sketch after this list).

  5. R is the "effective rank" of A. The true rank r is probably m or n: the data matrix is usually full rank.
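A sketch of the whole recipe on synthetic data (the data-generating choices below are assumptions for illustration, not from the text):

```python
import numpy as np

rng = np.random.default_rng(0)

# m = 5 variables, n = 200 samples, built near a 2-dimensional subspace
A = rng.normal(size=(5, 2)) @ rng.normal(size=(2, 200))
A += 0.01 * rng.normal(size=A.shape)      # small noise off the subspace
A -= A.mean(axis=1, keepdims=True)        # center each variable (each row)

U, sigma, Vt = np.linalg.svd(A, full_matrices=False)

# fraction of the total variance carried by each direction u_i
fractions = sigma**2 / np.sum(sigma**2)
print(np.round(fractions, 4))             # the first two fractions dominate

# keep directions until ~99% of the variance is explained
R = int(np.searchsorted(np.cumsum(fractions), 0.99)) + 1
print(R)                                  # effective rank R (2 expected here)
```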

Example: \(A = \left[ \begin{matrix} 3&-4&7&-1&-4&-3 \\ 7&-6&8&-1&-1&-7 \end{matrix} \right]\) has sample covariance \(S=AA^T/5 = \left [ \begin{matrix} 20&25 \\ 25&40 \end{matrix}\right]\)

The eigenvalues of S are approximately 57 and 3, so the first rank-one piece \(\sqrt{57}u_1v_1^T\) is much larger than the second piece \(\sqrt{3}u_2v_2^T\).

The leading eigenvector \(u_1 \approx (0.6,0.8)\) shows the dominant direction that you see in the scatter plot.

The SVD of A (centered data) shows the dominant direction in the scatter plot.

The second eigenvector \(u_2\) is perpendicular to \(u_1\). The second singular value \(\sigma_2=\sqrt{3}\) measures the spread across the dominant line.
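A quick check of these numbers, starting from the covariance matrix quoted above:

```python
import numpy as np

S = np.array([[20.0, 25.0],
              [25.0, 40.0]])

lam, u = np.linalg.eigh(S)      # eigh returns eigenvalues in ascending order
print(np.round(lam, 2))         # [ 3.07  56.93]  -- the "3 and 57" above
print(np.round(u[:, 1], 2))     # [0.56  0.83] up to sign: the (0.6, 0.8) direction
```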
