Eigenvectors - Eigenvalues_and_eigenvectors#Graphs
https://en.wikipedia.org/wiki/Eigenvalues_and_eigenvectors#Graphs
Given a linear transformation A, an eigenvector (in Chinese also rendered as 固有向量 or 本征向量) v is a non-zero vector that, after this linear transformation[1] is applied, still lies on the same line as the original v, although its length or direction may change. That is, Av = λv.
In linear algebra, an eigenvector or characteristic vector of a linear transformation is a non-zero vector that does not change its direction when that linear transformation is applied to it. More formally, if T is a linear transformation from a vector space V over a field F into itself and v is a vector in V that is not the zero vector, then v is an eigenvector of T if T(v) is a scalar multiple of v. This condition can be written as the equation
    T(v) = λv,
where λ is a scalar in the field F, known as the eigenvalue, characteristic value, or characteristic root associated with the eigenvector v.
If the vector space V is finite-dimensional, then the linear transformation T can be represented as a square matrix A, and the vector v by a column vector, rendering the above mapping as a matrix multiplication on the left hand side and a scaling of the column vector on the right hand side in the equation
    Av = λv.
There is a correspondence between n by n square matrices and linear transformations from an n-dimensional vector space to itself. For this reason, it is equivalent to define eigenvalues and eigenvectors using either the language of matrices or the language of linear transformations.[1][2]
Geometrically an eigenvector, corresponding to a real nonzero eigenvalue, points in a direction that is stretched by the transformation and the eigenvalue is the factor by which it is stretched. If the eigenvalue is negative, the direction is reversed.[3]
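As a minimal numerical sketch of the relation Av = λv (the 2x2 matrix below is an arbitrary illustration, not taken from the text), NumPy's `numpy.linalg.eig` returns the eigenvalues and the corresponding eigenvectors as matrix columns:

```python
import numpy as np

# An example 2x2 matrix (chosen for illustration only)
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])

# eig returns the eigenvalues and a matrix whose COLUMNS are eigenvectors
eigenvalues, eigenvectors = np.linalg.eig(A)

# Check A v = lambda v for each eigenpair
for i in range(len(eigenvalues)):
    v = eigenvectors[:, i]
    lam = eigenvalues[i]
    assert np.allclose(A @ v, lam * v)

print(eigenvalues)  # [2. 3.]
```

Each eigenvector is only determined up to a scalar multiple, so `eig` normalizes the returned columns to unit length.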
In the equation

    Av = λv,

the scalar λ is the factor by which the eigenvector's length is scaled under the linear transformation, and it is called the eigenvalue (本征值) associated with v. If the eigenvalue is positive, v keeps its direction after the transformation; if it is negative, the direction is reversed; if it is zero, the vector is collapsed to the origin. In every case, the result still lies on the same line.
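The three cases (positive, negative, and zero eigenvalues) can be illustrated with small hand-picked matrices; the matrices below are assumed examples for demonstration, not from the source:

```python
import numpy as np

# Reflection across the x-axis: eigenvalue +1 along x (direction kept),
# eigenvalue -1 along y (direction reversed).
R = np.array([[1.0,  0.0],
              [0.0, -1.0]])

vx = np.array([1.0, 0.0])   # eigenvector with lambda = +1
vy = np.array([0.0, 1.0])   # eigenvector with lambda = -1

assert np.allclose(R @ vx,  1.0 * vx)   # same direction
assert np.allclose(R @ vy, -1.0 * vy)   # direction reversed

# A rank-deficient (singular) matrix has eigenvalue 0:
# the corresponding eigenvector is collapsed to the origin.
S = np.array([[1.0, 0.0],
              [0.0, 0.0]])
v0 = np.array([0.0, 1.0])   # eigenvector with lambda = 0
assert np.allclose(S @ v0, np.zeros(2))
```

In all three cases the image of the eigenvector stays on the same line through the origin, which is exactly the defining property described above.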