1. Chap 1: Linear Equations and Matrices

    1. Linear equations
    2. Gaussian elimination
      • Pivot;
      • Triangularize;
      • Back substitution;
      • Coefficient matrix, augmented matrix, row vector & column vector;
      • the meaning of Ai* (the i-th row) and A*j (the j-th column);
      • 3 possibilities for the solution set (from the viewpoint of the linear equations): 0, 1, or infinitely many solutions;
      • Computational complexity: n^3/3+...;
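The pivot → triangularize → back-substitution procedure above can be sketched in plain Python. This is a minimal illustration (the name `solve_gaussian` is my own), assuming every pivot encountered is nonzero so no row interchanges are needed:

```python
def solve_gaussian(A, b):
    """Solve Ax = b by Gaussian elimination with back substitution.

    A: list of n rows (lists of floats); b: list of n floats.
    Assumes every pivot encountered is nonzero (no interchanges).
    """
    n = len(A)
    # Form the augmented matrix [A | b] so b is updated along with A.
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    # Triangularize: eliminate below each pivot, column by column.
    for k in range(n):
        for i in range(k + 1, n):
            m = M[i][k] / M[k][k]            # multiplier for row i
            for j in range(k, n + 1):
                M[i][j] -= m * M[k][j]
    # Back substitution on the resulting upper triangular system.
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        s = sum(M[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (M[i][n] - s) / M[i][i]
    return x

print(solve_gaussian([[2.0, 1.0], [1.0, 3.0]], [3.0, 4.0]))  # [1.0, 1.0]
```

The triple loop over k, i, j is where the n^3/3 multiplication count comes from.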
    3. Gauss-Jordan Method
      • Computational complexity: n^3/2+...;
    4. Roundoff error
      • Form of a floating-point number: f = ± .d1 d2 ... dt * b^n (d1 ≠ 0);
      • Roundoff error: finite-precision arithmetic loses digits, and the loss is amplified when coefficients of very different magnitudes are combined;
      • Partial pivoting: search the positions at and BELOW the pivotal position for the coefficient of maximum magnitude;
      • Complete pivoting: search the positions at and BELOW and to the RIGHT of the pivotal position (the whole remaining submatrix) for the coefficient of maximum magnitude;
      • Partial vs. complete pivoting: the difference is whether elementary column operations are used. Partial pivoting is used more often because column interchanges reorder the unknowns and need extra bookkeeping;
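A tiny pivot ruins naive elimination, while partial pivoting recovers the answer. A minimal sketch (the function name is my own):

```python
def solve_partial_pivoting(A, b):
    """Gaussian elimination with partial pivoting: before eliminating
    in column k, swap up the row (at or below row k) whose entry in
    column k has the largest magnitude."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for k in range(n):
        # Partial pivoting: search at and below the pivotal position.
        p = max(range(k, n), key=lambda i: abs(M[i][k]))
        M[k], M[p] = M[p], M[k]
        for i in range(k + 1, n):
            m = M[i][k] / M[k][k]
            for j in range(k, n + 1):
                M[i][j] -= m * M[k][j]
    # Back substitution.
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        s = sum(M[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (M[i][n] - s) / M[i][i]
    return x

# The tiny pivot 1e-17 would destroy naive elimination (multiplier 1e17
# swamps the other entries), but the row swap keeps multipliers small.
print(solve_partial_pivoting([[1e-17, 1.0], [1.0, 1.0]], [1.0, 2.0]))  # [1.0, 1.0]
```

Without the swap, double-precision roundoff gives x ≈ [0.0, 1.0] for this system, which is badly wrong in the first component.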
    5. The ill-conditioned system
      • the solution of an ill-conditioned system is extremely sensitive to small perturbations of the coefficients;
      • Geometrical view: the two lines (or planes) are almost parallel, so their intersection point moves dramatically when either one is perturbed slightly;
      • How to notice the ill-condition of a linear system: essentially by trial (it is not easy to tell whether a system is ill-conditioned);
      • 2 ways to deal with the problem: bite the bullet and compute an accurate solution, or redesign the experimental setup to avoid producing an ill-conditioned system. The latter is empirically better; discovering that a system is ill-conditioned as early as possible saves much time;
    6. Row echelon form
      • Notation: E;
      • Cause: linear dependence among the columns, combined with the modified Gaussian elimination;
      • The echelon form (namely the positions of the pivots) is uniquely determined by the entries of A. However, the entries of E are not uniquely determined by A.
      • Basic column: the columns in A which contain the pivotal position;
      • Rank: the number of pivots = the number of nonzero rows in E = the number of basic columns in A;
      • Reduced row echelon form: produced by the Gauss-Jordan Method (each pivotal column has the form [0 ... 0 1 0 ... 0]T), denoted EA;
      • Both the form and the entries of EA are uniquely determined by A;
      • EA reveals the hidden relationships among the columns of A;
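A sketch of the Gauss-Jordan reduction to EA (my own `rref` helper, with a tolerance-based pivot test): the pivotal columns give the rank, and each nonpivotal column of EA spells out how the corresponding column of A combines the basic columns.

```python
def rref(A, tol=1e-12):
    """Gauss-Jordan reduction of A to its reduced row echelon form:
    every pivot is scaled to 1 and clears its whole column.
    Returns (E_A, list of pivotal column indices)."""
    M = [row[:] for row in A]
    rows, cols = len(M), len(M[0])
    pivots, r = [], 0
    for c in range(cols):
        if r == rows:
            break
        # Pick the largest-magnitude candidate pivot in column c.
        p = max(range(r, rows), key=lambda i: abs(M[i][c]))
        if abs(M[p][c]) < tol:
            continue                      # no pivot in this column
        M[r], M[p] = M[p], M[r]
        piv = M[r][c]
        M[r] = [v / piv for v in M[r]]
        for i in range(rows):
            if i != r:
                m = M[i][c]
                M[i] = [M[i][j] - m * M[r][j] for j in range(cols)]
        pivots.append(c)
        r += 1
    return M, pivots

E, pivots = rref([[1.0, 2.0, 3.0],
                  [2.0, 4.0, 6.0],
                  [1.0, 1.0, 1.0]])
print(pivots)            # [0, 1]: rank 2, columns 1 and 2 of A are basic
print(E[0][2], E[1][2])  # -1.0 2.0: column 3 of A = -1*(column 1) + 2*(column 2)
```

The third column of EA stores exactly the coefficients that express the third column of A in terms of the basic columns.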
    7. Consistency of linear system
      • A system is consistent if it has at least one solution. Otherwise, it is inconsistent.
      • When n (the number of equations) is two or three, the consistency of the system can be seen geometrically: the lines or planes share a common point.
      • If  n>3, we can judge through the following method:
        • In the augmented matrix [A|b], no row of the form [0 0 ... 0 | α] with α ≠ 0 appears;
        • In [A|b], b is a nonbasic column;
        • rank([A|b]) = rank(A);
        • b is a combination of the basic columns of A.
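The rank test above can be coded directly: row-reduce A and [A|b] and compare the pivot counts. A small sketch (helper names are mine):

```python
def rank(M, tol=1e-12):
    """Rank = number of pivots found during row reduction."""
    M = [row[:] for row in M]
    rows, cols = len(M), len(M[0])
    r = 0
    for c in range(cols):
        if r == rows:
            break
        p = max(range(r, rows), key=lambda i: abs(M[i][c]))
        if abs(M[p][c]) < tol:
            continue                      # no pivot in this column
        M[r], M[p] = M[p], M[r]
        for i in range(r + 1, rows):
            m = M[i][c] / M[r][c]
            for j in range(c, cols):
                M[i][j] -= m * M[r][j]
        r += 1
    return r

def is_consistent(A, b):
    """Ax = b is consistent iff appending b does not raise the rank."""
    aug = [A[i][:] + [b[i]] for i in range(len(A))]
    return rank(aug) == rank(A)

A = [[1.0, 1.0], [2.0, 2.0]]
print(is_consistent(A, [1.0, 2.0]))  # True: b is a combination of A's columns
print(is_consistent(A, [1.0, 3.0]))  # False: appending b raises the rank
```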
    8. Homogeneous system
      • Homogeneous and nonhomogeneous;
      • Trivial solution;
      • A homogeneous system must be a consistent system;
      • General solution: basic variable, free variable;
    9. Nonhomogeneous system
      • General solution;
      • The system possesses a unique solution if and only if any of the following equivalent conditions holds:
        • rank(A) = the number of unknowns;
        • there is no free variable;
        • the associated homogeneous system has only the trivial solution;
  2. Chap 2: Matrix Algebra

    1. Addition

      • Addition and additive inverse;
      • Addition properties;
    2. Scalar multiplication
    3. Transpose
      • Transpose and conjugate transpose;
      • Properties;
      • Symmetry;
        • Symmetric matrix, skew-symmetric matrix, Hermitian matrix, skew-Hermitian matrix;
    4. Multiplication
      • Linear function: f(x1+x2)=f(x1)+f(x2), f(kx)=kf(x) <=> f(kx+y)=kf(x)+f(y);
      • Affine function: translation of linear function;
      • Matrix multiplication;
      • Properties: distributive laws (left and right) and the associative law, but no commutative law;
      • Trace
        • Definition: the sum of diagonal entries;
        • Properties: trace(AB) = trace(BA); the trace is invariant under cyclic permutations, trace(ABC) = trace(BCA) = trace(CAB), but in general trace(ABC) ≠ trace(ACB);
      • Meaning of rows and columns in a product
        • [AB]i* = linear combination of row vectors in B based on i-th row vector in A;
        • [AB]*j = linear combination of column vectors in A based on j-th column vector in B;
        • column vector * row vector = a matrix of rank (at most) 1 (outer product);
        • row vector * column vector <=> inner product;
      • Identity matrix;
      • Power: A^k, defined for nonnegative integer exponents k;
      • Block matrix multiplication;
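The trace identities above are easy to verify on small integer matrices; a quick pure-Python check (the `trace`/`matmul` helpers are toy implementations of my own, not a library API):

```python
def trace(M):
    """Trace: the sum of the diagonal entries."""
    return sum(M[i][i] for i in range(len(M)))

def matmul(A, B):
    """Plain triple-loop matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 0]]
C = [[1, 0], [0, 2]]

print(trace(matmul(A, B)), trace(matmul(B, A)))  # 5 5, even though AB != BA

ABC = trace(matmul(matmul(A, B), C))
BCA = trace(matmul(matmul(B, C), A))
ACB = trace(matmul(matmul(A, C), B))
print(ABC, BCA, ACB)  # 8 8 7: cyclic shifts agree, the swap ACB does not
```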
    5. Inversion
      • Only square matrices can have inverses;
      • AB = I and BA = I (when only square matrices are involved, either one of the two equations suffices);
      • Nonsingular matrix and singular matrix;
      • When an inversion exists, it is unique. That means:
        • If A is nonsingular, the equation Ax=b has the unique solution x=A'b;
        • If A is nonsingular, rank(A) =n (full rank);
        • If A is nonsingular, the unknown x has no free variable;
        • If A is nonsingular, the associated homogeneous system has only the trivial solution;
      • Existence of the matrix inverse: A' exists <=> rank(A)=n <=> A can be transformed to I via the Gauss-Jordan Method <=> Ax=0 has only the trivial solution;
      • Computing an inversion: transforming [A|I] to [I|A'] via Gauss-Jordan Method;
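The [A|I] → [I|A'] procedure can be sketched directly (partial pivoting added for numerical safety; the name `inverse` is my own):

```python
def inverse(A, tol=1e-12):
    """Invert A by Gauss-Jordan on the augmented matrix [A | I]:
    when the left half has become I, the right half is A^(-1)."""
    n = len(A)
    M = [A[i][:] + [1.0 if i == j else 0.0 for j in range(n)]
         for i in range(n)]
    for k in range(n):
        # Partial pivoting for numerical safety.
        p = max(range(k, n), key=lambda i: abs(M[i][k]))
        if abs(M[p][k]) < tol:
            raise ValueError("matrix is singular")
        M[k], M[p] = M[p], M[k]
        piv = M[k][k]
        M[k] = [v / piv for v in M[k]]
        for i in range(n):
            if i != k:
                m = M[i][k]
                M[i] = [M[i][j] - m * M[k][j] for j in range(2 * n)]
    return [row[n:] for row in M]

print(inverse([[2.0, 1.0], [1.0, 1.0]]))  # [[1.0, -1.0], [-1.0, 2.0]]
```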
      • Complexity(x=A'b) > Complexity(Gaussian Elimination):
        • C(GE) ≈ n^3/3 multiplications;
        • C(x=A'b) = C(computing A') + C(A'b) ≈ n^3 + n^2 ≈ n^3, roughly three times the cost of Gaussian elimination;
      • Properties:
        • (A')' = A;
        • If A and B are nonsingular, then AB is also nonsingular;
        • (AB)' = B'A';
        • (A')^T = (A^T)' as well as (A')* = (A*)';
      • Inversion of sum and sensitivity:
        • Directly relating (A+B)' to A' and B' is not meaningful in general;
        • Sherman-Morrison formula: the inverse after a small (rank-1) perturbation;
        • Neumann Series:
          • If lim(n→∞) A^n = 0, then (I-A) is nonsingular and (I-A)' = I + A + A^2 + ... = Σi A^i;
          • To compute (A+B)', rewrite A + B = A(I - (-A'B)), so (A+B)' = (I - (-A'B))' A';
          • (A+B)' ≈ A' - A'BA': a perturbation B on A changes the inverse by about A'BA'. When A' is large, a small perturbation changes the result a lot;
        • Condition number;
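The Neumann series can be checked numerically; a minimal pure-Python sketch (A is chosen small so that A^n → 0; the name `neumann_inverse` is mine):

```python
def neumann_inverse(A, terms=60):
    """Approximate (I - A)^(-1) by the Neumann series I + A + A^2 + ...,
    which converges when the powers of A tend to zero."""
    n = len(A)
    I = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    S = [row[:] for row in I]   # running sum, starts at A^0 = I
    P = [row[:] for row in I]   # current power of A
    for _ in range(terms):
        P = [[sum(P[i][k] * A[k][j] for k in range(n)) for j in range(n)]
             for i in range(n)]                        # next power of A
        S = [[S[i][j] + P[i][j] for j in range(n)] for i in range(n)]
    return S

# A is small, so the series converges quickly.
S = neumann_inverse([[0.1, 0.2], [0.0, 0.1]])
print(S[0][0], S[1][1])  # both close to 1/0.9, the exact diagonal of (I-A)^(-1)
```

Truncating the same series after the linear term is what gives the first-order perturbation estimate for the inverse.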
    6. Elementary Matrices and Equivalence
      • Elementary matrix: I-uv^T, u and v are column vectors;

        • The inversion of an elementary matrix is also an elementary matrix;
        • Elementary matrices associated with three types of elementary row (or column) operation;
        • A is a nonsingular matrix <=> A is a product of elementary matrices corresponding to Type I, II and III row (or column) operations;
      • Equivalence: A~B <=> PAQ=B for nonsingular P and Q;
        • Row equivalence and column equivalence;
        • Rank normal form: if A is an m*n matrix with rank(A)=r, then A ~ Nr = [[Ir, 0], [0, 0]] (2×2 block form with Ir in the upper-left block); Nr is called the rank normal form of A;
        • A~B <=> rank(A)=rank(B);
        • Corollary: Multiplication by nonsingular matrices cannot change rank;
          • rank(A^T)=rank(A);
          • rank(A*)=rank(A);
    7. LU factorization
      • Origin: Gaussian Elimination;
      • LU factorization: A=LU, L: lower triangular matrix, U: upper triangular matrix;
      • Observations on L and U:
        • L:

          • a lower triangular matrix;
          • 1's on the diagonal: each elimination step adds a scalar multiple of another row to a row (Type III), which leaves the diagonal untouched;
          • the entries below the diagonal record the multipliers used in the elimination;
        • U:
          • an upper triangular matrix;
          • the result of the elimination on A;
      • *L and U are unique;
        • proof: A = L1U1 = L2U2 => L2'L1 = U2U1'. The left-hand side is unit lower triangular and the right-hand side is upper triangular; since they are equal, both must be I, hence L1 = L2 and U1 = U2.
      • *If a row interchange becomes necessary during LU factorization, the triangular pattern is destroyed (this is what leads to the PA=LU form);
      • Advantages of LU factorization:
        • If only one system Ax=b needs to be solved, Gaussian Elimination is enough;
        • If several systems sharing the same coefficient matrix need to be solved, LU factorization is better;
        • Once the LU factors of A are known, any other system Ax=b can be solved in n^2 multiplications and n^2-n additions;
      • Existence of LU:
        • No zero pivot emerges during row reduction to upper triangular form using only Type III operations;
        • An equivalent characterization via principal submatrices: every leading principal submatrix of A is nonsingular;
      • PLU factorization: PA=LU;
      • LDU factorization: A=LDU, D=diag(u11, u22, ..., unn);
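The factor-once / solve-many idea behind LU can be sketched as follows (Doolittle-style, no pivoting, assuming no zero pivot is met; helper names are mine). After the one-time factorization, each extra right-hand side costs only two triangular solves:

```python
def lu_factor(A):
    """LU factorization without pivoting: L is unit lower triangular
    (the multipliers go below its diagonal), U is the eliminated A.
    Assumes no zero pivot is encountered."""
    n = len(A)
    U = [row[:] for row in A]
    L = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    for k in range(n):
        for i in range(k + 1, n):
            L[i][k] = U[i][k] / U[k][k]    # record the multiplier
            for j in range(k, n):
                U[i][j] -= L[i][k] * U[k][j]
    return L, U

def lu_solve(L, U, b):
    """Reuse the factors: forward-substitute Ly = b, then
    back-substitute Ux = y (about n^2 multiplications per solve)."""
    n = len(b)
    y = [0.0] * n
    for i in range(n):
        y[i] = b[i] - sum(L[i][j] * y[j] for j in range(i))
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (y[i] - sum(U[i][j] * x[j] for j in range(i + 1, n))) / U[i][i]
    return x

L, U = lu_factor([[2.0, 1.0], [4.0, 3.0]])
print(L[1][0], U[1][1])            # 2.0 1.0 (the multiplier and the last pivot)
print(lu_solve(L, U, [3.0, 7.0]))  # [1.0, 1.0]
```

Any further right-hand side can now be handled by `lu_solve` alone, without repeating the elimination.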
  3. Vector Spaces
    1. Spaces and subspaces

      • Vector space;
      • Scalar field F: R for real numbers and C for complex numbers;
