4.1 Orthogonal Vectors and Subspaces

  • Orthogonal vectors have \(v^Tw=0\), and \(||v||^2 + ||w||^2 = ||v+w||^2 = ||v-w||^2\).

  • Subspaces \(V\) and \(W\) are orthogonal when \(v^Tw = 0\) for every \(v\) in \(V\) and every \(w\) in \(W\).

  • Four Subspaces Relations:

    • Every vector x in the nullspace is perpendicular to every row of A, because \(Ax=0\). The nullspace N(A) and the row space \(C(A^T)\) are orthogonal subspaces of \(R^n\): \(N(A) \bot C(A^T)\).

      \[Ax = \left[ \begin{matrix} &row_1& \\ &\vdots \\ &row_m \end{matrix} \right]
      \left[ \begin{matrix} && \\ &x \\ && \end{matrix} \right] = \left[ \begin{matrix} &0& \\ &\vdots \\ &0 \end{matrix} \right]
      \]
    • Every vector y in the nullspace of \(A^T\) is perpendicular to every column of A, because \(A^Ty=0\). The left nullspace \(N(A^T)\) and the column space \(C(A)\) are orthogonal subspaces of \(R^m\): \(N(A^T) \bot C(A)\).

      \[A^Ty = \left[ \begin{matrix} &(column_1)^T& \\ &\vdots \\ &(column_n)^T \end{matrix} \right]
      \left[ \begin{matrix} && \\ &y \\ && \end{matrix} \right] = \left[ \begin{matrix} &0& \\ &\vdots \\ &0 \end{matrix} \right]
      \]
    • The nullspace N(A) and the row space \(C(A^T)\) are orthogonal complements, with dimensions (n-r) + r = n. Similarly \(N(A^T)\) and \(C(A)\) are orthogonal complements with (m-r) + r = m. A small numerical check of this orthogonality follows below.
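
A minimal NumPy sketch of the check promised above: pick a small matrix (chosen only for illustration), compute a nullspace vector, and confirm it is perpendicular to every row.

```python
import numpy as np

# A hypothetical 2x3 matrix used only for illustration (rank 2, so dim N(A) = 1)
A = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 1.0]])

# The last right-singular vector of the SVD spans N(A) when the rank is 2
_, _, Vt = np.linalg.svd(A)
x = Vt[-1]

print(np.allclose(A @ x, 0))          # True: x lies in the nullspace
print(A[0] @ x, A[1] @ x)             # each ~0: x is perpendicular to every row
```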

4.2 Projection

When b is not in the column space C(A), Ax = b is unsolvable, so we find the projection p of b onto C(A) and solve \(A\hat{x} = p\) instead.

Three Steps:

  1. Find \(\hat{x}\)
  2. Find p
  3. Find the projection matrix P

Projection Onto a Line

A line goes through the origin in the direction of \(a=(a_1, a_2,...,a_m)\).

Along that line, we want the point \(p\) closest to \(b=(b_1,b_2,...,b_m)\).

The key to projection is orthogonality: the line from b to p is perpendicular to the vector a.

Projecting b onto a with error \(e=b-\hat{x}a\) :

\[a \cdot(b-\hat{x}a) = a^T(b-\hat{x}a)=0 \\
\Downarrow \\
\hat{x} = \frac{a \cdot b}{a \cdot a} = \frac{a^T b}{a^T a} \\
\Downarrow \\
p = a \hat{x} =a\frac{a^T b}{a^T a} = Pb \\
\Downarrow \\
P = \frac{aa^T}{a^Ta} \\
\]

\(P=P^T\) (P is symmetric), \(P^2=P\) (projecting twice changes nothing)
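
As a concrete check of these formulas, here is a minimal NumPy sketch of projection onto a line; the vectors a and b are hypothetical, chosen only for illustration.

```python
import numpy as np

a = np.array([1.0, 1.0, 1.0])   # direction of the line (hypothetical)
b = np.array([1.0, 2.0, 3.0])   # vector to project (hypothetical)

x_hat = (a @ b) / (a @ a)       # x_hat = a^T b / a^T a
p = x_hat * a                   # p = a * x_hat
e = b - p                       # error e = b - p

P = np.outer(a, a) / (a @ a)    # projection matrix P = a a^T / a^T a
print(p)                                            # [2. 2. 2.]
print(np.isclose(a @ e, 0))                         # True: e is perpendicular to a
print(np.allclose(P, P.T), np.allclose(P @ P, P))   # True True
```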

Projection Onto a Subspace

Start with n independent vectors \(a_1,...,a_n\) (the columns of A) in \(R^m\), and find the combination \(p=\hat{x}_1a_1 + \hat{x}_2a_2 +...+\hat{x}_na_n\) closest to a given vector b.

\[A^T(b-A\hat{x}) = \left[ \begin{matrix} -&a_1^T&- \\ &\vdots \\ -&a_n^T&- \end{matrix} \right]
\left[ \begin{matrix} && \\ &b-A\hat{x} \\ && \end{matrix} \right]
= \left[ \begin{matrix} &0& \\ &\vdots \\ &0 \end{matrix} \right] \\
\Downarrow \\
A^T(b-A\hat{x}) = 0 \quad or \quad A^TA\hat{x} = A^Tb \\
\Downarrow \quad A^TA \ \ invertible\\
p=A\hat{x} = A(A^TA)^{-1}A^Tb \\
\Downarrow \\
P=A(A^TA)^{-1}A^T
\]

\(P=P^T\) (P is symmetric), \(P^2=P\) (projecting twice changes nothing)
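
The same idea in NumPy for a subspace: a sketch that forms \(A^TA\hat{x} = A^Tb\) and the projection matrix \(P = A(A^TA)^{-1}A^T\), with a hypothetical A and b chosen only for illustration.

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])      # two independent columns in R^3 (hypothetical)
b = np.array([6.0, 0.0, 0.0])   # hypothetical vector to project

x_hat = np.linalg.solve(A.T @ A, A.T @ b)     # solve A^T A x_hat = A^T b
p = A @ x_hat                                 # projection p = A x_hat
P = A @ np.linalg.solve(A.T @ A, A.T)         # P = A (A^T A)^{-1} A^T

print(np.allclose(A.T @ (b - p), 0))                 # True: error is perpendicular to the columns of A
print(np.allclose(P, P.T), np.allclose(P @ P, P))    # True True
```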

4.3 Least Squares Approximations

When b is outside the column space of A, \(Ax=b\) has no solution. Making the length of \(e=b-Ax\) as small as possible gives the least squares solution \(\hat{x}\).

If A has independent columns, then \(A^TA\) is invertible.

Proof:

\[A^TAx = 0 \\
\Downarrow \\
x^TA^TAx = 0 \\
\Downarrow \\
||Ax||^2 = (Ax)^TAx =0 \\
\Downarrow \\
Ax =0 \\
\Downarrow \quad A \ \ has \ \ independent \ \ columns \\
x=0
\]

So \(N(A^TA)\) contains only the zero vector, and \(A^TA\) is invertible.

Linear algebra method (projection method)

Solving \(A^TA\hat{x} = A^Tb\) gives the projection \(p=A\hat{x}\) of b onto the column space of A.

\[Ax=b \ \ (unsolvable) \\
b = p+e \\
\Downarrow \\
A\hat{x} = p \ \ (solvable) \\
\Downarrow \\
\hat{x} = (A^TA)^{-1}A^Tb
\]

example: \(A = \left[ \begin{matrix} 1&1 \\ 1&2 \\ 1&3 \end{matrix} \right], x= \left[ \begin{matrix} C \\ D \end{matrix} \right], b=\left[ \begin{matrix} 1 \\ 2 \\ 2 \end{matrix} \right], Ax=b\) is unsolvable.

\[A^TA\hat{x} = A^Tb \\
\Downarrow \\
A^TA = \left[ \begin{matrix} 1&1&1 \\ 1&2&3 \end{matrix} \right] \left[ \begin{matrix} 1&1 \\ 1&2 \\ 1&3 \end{matrix} \right] = \left[ \begin{matrix} 3&6 \\ 6&14 \end{matrix} \right] \\
A^Tb = \left[ \begin{matrix} 1&1&1 \\ 1&2&3 \end{matrix} \right] \left[ \begin{matrix} 1 \\ 2 \\ 2 \end{matrix} \right] = \left[ \begin{matrix} 5 \\ 11 \end{matrix} \right] \\
\Downarrow \\
\left[ \begin{matrix} 3&6 \\ 6&14 \end{matrix} \right] \left[ \begin{matrix} C \\ D\\ \end{matrix} \right] = \left[ \begin{matrix} 5 \\ 11 \end{matrix} \right] \\
\Downarrow \\
C = 2/3 \ , \ D=1/2 \\
\Downarrow \\
\hat{x} = \left[ \begin{matrix} 2/3 \\ 1/2\end{matrix} \right] \\
\]
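
The same numbers can be checked in NumPy; this sketch solves the normal equations directly and compares against the library least squares routine, which minimizes \(||Ax-b||^2\).

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0])

# Normal equations A^T A x_hat = A^T b
x_hat = np.linalg.solve(A.T @ A, A.T @ b)
print(x_hat)                                   # [0.6667 0.5] -> C = 2/3, D = 1/2

# np.linalg.lstsq minimizes ||Ax - b||^2 and gives the same answer
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(x_hat, x_lstsq))             # True
```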

Calculus method

The least squares solution \(\hat{x}\) makes the sum of squared errors \(E = ||Ax - b||^2\) as small as possible.

\[E = e_1^2 + e_2^2 +...+e_m^2
\]

example: \(A = \left[ \begin{matrix} 1&1 \\ 1&2 \\ 1&3 \end{matrix} \right], x= \left[ \begin{matrix} C \\ D \end{matrix} \right], b=\left[ \begin{matrix} 1 \\ 2 \\ 2 \end{matrix} \right], Ax=b\) is unsolvable.

\[E = ||Ax - b||^2 = e_1^2 + e_2^2 +e_3^2 = (C +D \cdot 1 -1)^2 + (C +D \cdot 2 -2)^2 + (C +D \cdot 3 -2)^2 \\
\Downarrow \\
\partial E / \partial C = 0 \\
\partial E / \partial D = 0 \\
\Downarrow \\
C = 2/3 \ , \ D=1/2
\]
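
The partial derivatives can also be taken symbolically. A sketch with SymPy (assuming SymPy is available) that reproduces C = 2/3, D = 1/2:

```python
from sympy import symbols, diff, solve

C, D = symbols('C D')
E = (C + D*1 - 1)**2 + (C + D*2 - 2)**2 + (C + D*3 - 2)**2

# Setting both partial derivatives to zero gives the normal equations
sol = solve([diff(E, C), diff(E, D)], [C, D])
print(sol)   # {C: 2/3, D: 1/2}
```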

The Big Picture for Least Squares

The vector b splits into \(b = p + e\): the projection \(p = A\hat{x}\) lies in the column space C(A), and the error \(e = b - p\) lies in the left nullspace \(N(A^T)\), perpendicular to C(A).

4.4 Orthonormal Bases and Gram-Schmidt

  • The columns \(q_1,q_2,...,q_n\) are orthonormal if \(q_i^Tq_j = \left\{ \begin{array}{l} 0 \ \ for \ \ i \neq j \\ 1 \ \ for \ \ i=j \end{array} \right.\), so \(Q^TQ=I\).
  • If Q is also square, then \(QQ^T=I\) and \(Q^T=Q^{-1}\): Q is an orthogonal matrix.
  • The least squares solution to \(Qx = b\) is \(\hat{x} = Q^Tb\). Projection of b : \(p=QQ^Tb = Pb\).

Projection Using Orthonormal Bases: Q Replaces A

Orthogonal matrices are excellent and stable for computations.

Suppose the basis vectors are actually orthonormal. Then \(A^TA\) simplifies to \(Q^TQ = I\), so \(\hat{x} = Q^Tb\) and \(p=Q\hat{x}=QQ^Tb\).

\[p = \left[ \begin{matrix} |&&| \\ q_1&\cdots&q_n \\ |&&| \end{matrix} \right] \left[ \begin{matrix} q_1^Tb \\ \vdots \\ q_n^Tb \end{matrix} \right] = q_1(q_1^Tb) + ... + q_n(q_n^Tb)
\]
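
A quick NumPy check that no inversion is needed when the columns are orthonormal; the Q and b below are hypothetical, chosen only for illustration.

```python
import numpy as np

Q = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])      # two orthonormal columns in R^3 (hypothetical)
b = np.array([3.0, 4.0, 5.0])

x_hat = Q.T @ b                 # x_hat = Q^T b, since Q^T Q = I
p = Q @ x_hat                   # p = Q Q^T b = q1(q1^T b) + q2(q2^T b)
print(p)                        # [3. 4. 0.]
```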

The Gram-Schmidt Process

"Gram-Schmidt way" is to create orthonormal vectors.Take independent column vecotrs \(a_i\) to orthonormal \(q_i\).

From independent vectors \(a_1,...,a_n\), Gram-Schmidt constructs orthonormal vectors \(q_1,...,q_n\). The matrices with these columns satisfy \(A=QR\). The matrix \(R=Q^TA\) is upper triangular because later q's are orthogonal to earlier a's.

Gram-Schmidt Process:

Subtract from every new vector its projections in the directions already set.

  • First step : Choosing A=a and \(B = b-\frac{A^Tb}{A^TA}A\) -----> $A \bot B $
  • Second step : \(C = c-\frac{A^Tc}{A^TA}A - \frac{B^Tc}{B^TB}B\) -----> $C \bot A \ \ C \bot B $
  • Last step (orthonormal): \(q_1 = \frac{A}{||A||},q_2 = \frac{B}{||B||},q_3 = \frac{C}{||C||}\)

The vectors \(q_1,q_2,q_3\) form an orthonormal basis. \(Q=\left[ \begin{matrix} && \\ q_1&q_2&q_3 \\ && \end{matrix} \right]\)

**The Factorization M = QR**:

Any m by n matrix M with independent columns can be factored into \(M=QR\).

\[M=QR=\left[ \begin{matrix} &&\\ q_1&q_2&q_3 \\ && \end{matrix} \right]
\left[ \begin{matrix} q_1^Ta&q_1^Tb&q_1^Tc \\ &q_2^Tb&q_2^Tc \\ &&q_3^Tc \end{matrix} \right]
\]

example:

Suppose the independent, non-orthogonal vectors \(a,b,c\) are

\[a = \left[ \begin{matrix} 1 \\ -1 \\ 0\end{matrix} \right] \ and \ \
b = \left[ \begin{matrix} 2 \\ 0 \\ -2\end{matrix} \right] \ and \ \
c = \left[ \begin{matrix} 3 \\ -3 \\ 3\end{matrix} \right]
\]

Then A=a has \(A^TA=2, A^Tb=2\)

First step : \(B=b-\frac{A^Tb}{A^TA}A = \left[ \begin{matrix} 1 \\ 1 \\ -2\end{matrix} \right]\)

Second step : \(C=c-\frac{A^Tc}{A^TA}A -\frac{B^Tc}{B^TB}B = \left[ \begin{matrix} 1 \\ 1 \\ 1\end{matrix} \right]\)

Last step : \(q_1 = \frac{A}{||A||}= \frac{1}{\sqrt{2}}\left[ \begin{matrix} 1 \\ -1 \\ 0\end{matrix} \right],q_2 = \frac{B}{||B||}=\frac{1}{\sqrt{6}}\left[ \begin{matrix} 1 \\ 1 \\ -2\end{matrix} \right],q_3 = \frac{C}{||C||}=\frac{1}{\sqrt{3}}\left[ \begin{matrix} 1 \\ 1 \\ 1\end{matrix} \right]\)

Factorization :

\[M =\left[ \begin{matrix} 1&2&3 \\ -1&0&-3 \\ 0&-2&3 \end{matrix} \right]
= \left[ \begin{matrix} 1/\sqrt{2}&1/\sqrt{6}&1/\sqrt{3} \\ -1/\sqrt{2}&1/\sqrt{6}&1/\sqrt{3} \\ 0&-2/\sqrt{6}&1/\sqrt{3} \end{matrix} \right]
\left[ \begin{matrix} \sqrt{2}&\sqrt{2}&\sqrt{18} \\ 0&\sqrt{6}&-\sqrt{6} \\ 0&0&\sqrt{3} \end{matrix} \right] = QR
\]
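
The same example carried out in code: a sketch of the classical Gram-Schmidt steps above (numerically, modified Gram-Schmidt or a library QR is usually preferred), checking that M = QR.

```python
import numpy as np

a = np.array([1.0, -1.0, 0.0])
b = np.array([2.0, 0.0, -2.0])
c = np.array([3.0, -3.0, 3.0])

A = a
B = b - (A @ b) / (A @ A) * A                              # (1, 1, -2)
C = c - (A @ c) / (A @ A) * A - (B @ c) / (B @ B) * B      # (1, 1, 1)

Q = np.column_stack([A / np.linalg.norm(A),
                     B / np.linalg.norm(B),
                     C / np.linalg.norm(C)])
M = np.column_stack([a, b, c])
R = Q.T @ M                                # upper triangular (below-diagonal entries ~0)

print(np.allclose(Q @ R, M))               # True: M = QR
print(np.round(R, 4))                      # matches [[sqrt(2), sqrt(2), sqrt(18)], [0, sqrt(6), -sqrt(6)], [0, 0, sqrt(3)]]
```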

Least squares

Instead of solving \(Ax=b\), which is impossible, we solve \(R\hat{x}=Q^Tb\) by back substitution, which is very fast.

\[A^TA\hat{x}=A^Tb \\
A^TA =(QR)^TQR = R^TQ^TQR=R^TR \\
\Downarrow \\
R^TR\hat{x} =R^TQ^Tb \quad or \quad R\hat{x} =Q^Tb \quad or \quad \hat{x} =R^{-1}Q^Tb \\
\]
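
A sketch of this QR route in NumPy, reusing the A and b from the example in Section 4.3; `np.linalg.solve` stands in here for a hand-written back substitution on the triangular R.

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0])

# Reduced QR: Q has orthonormal columns (3x2), R is upper triangular (2x2)
Q, R = np.linalg.qr(A)

# Solve R x_hat = Q^T b (back substitution applies since R is triangular)
x_hat = np.linalg.solve(R, Q.T @ b)
print(x_hat)          # [0.6667 0.5] -> C = 2/3, D = 1/2 again
```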
