Step 0: Load data

The starter code loads 45 two-dimensional data points into x. When plotted with the scatter function, they should appear as a roughly elliptical point cloud (the "Raw data" figure).
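A minimal sketch of this step, assuming pcaData.txt is in the working directory (as in the starter code):

x = load('pcaData.txt', '-ascii');   % 2 x 45 matrix, one data point per column
figure();
scatter(x(1, :), x(2, :));           % plot each 2D point as a circle
title('Raw data');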

Step 1: Implement PCA

In this step, you will implement PCA to obtain xRot, the matrix in which the data is "rotated" into the basis made up of the principal components.

Step 1a: Finding the PCA basis

Find u1 and u2, the two principal directions (the columns of U), and draw two lines in your figure to show the resulting basis on top of the given data points.
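A minimal sketch of finding the basis, assuming the data in x is already zero-mean (as it is for this dataset):

[n, m] = size(x);                 % n = 2 dimensions, m = 45 points
sigma = (1.0 / m) * x * x';       % sample covariance matrix
[u, s, v] = svd(sigma);           % columns of u are the principal directions
hold on
plot([0 u(1,1)], [0 u(2,1)]);     % line along u1
plot([0 u(1,2)], [0 u(2,2)]);     % line along u2
scatter(x(1, :), x(2, :));        % re-plot the data on top of the lines
hold off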

Step 1b: Check xRot

Compute xRot, and use the scatter function to check that it looks as expected: the same point cloud, rotated so that the principal directions align with the coordinate axes.
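The rotation itself is a single matrix product; a short sketch using the u computed above:

xRot = u' * x;                    % express each point in the eigenbasis
figure();
scatter(xRot(1, :), xRot(2, :));
title('xRot');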

Step 2: Dimension reduce and replot

In this step, set k, the number of components to retain, to 1. Compute the reduced representation, then project it back onto the original axes to obtain xHat, and replot it to see the effect of dimension reduction.
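A sketch of the reduction and back-projection, keeping only the first principal component (k = 1). The intermediate name xTilde is introduced here only for clarity; the full listing below uses an equivalent zero-padded form.

k = 1;
xTilde = u(:, 1:k)' * x;          % reduced representation (k x m)
xHat = u(:, 1:k) * xTilde;        % project back onto the original axes
figure();
scatter(xHat(1, :), xHat(2, :));
title('xHat');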

Step 3: PCA Whitening
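PCA whitening rescales each rotated component by 1/sqrt(lambda_i + epsilon), where lambda_i is the corresponding eigenvalue of sigma, so that every component has approximately unit variance. A sketch, using epsilon = 1e-5 as in the listing below:

epsilon = 1e-5;                   % regularization term for small eigenvalues
xPCAWhite = diag(1 ./ sqrt(diag(s) + epsilon)) * u' * x;
figure();
scatter(xPCAWhite(1, :), xPCAWhite(2, :));
title('xPCAWhite');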

Step 4: ZCA Whitening
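ZCA whitening multiplies the PCA-whitened data by U on the left, rotating it back into the original coordinate system while keeping it whitened. A sketch matching the listing below:

xZCAWhite = u * diag(1 ./ sqrt(diag(s) + epsilon)) * u' * x;
figure();
scatter(xZCAWhite(1, :), xZCAWhite(2, :));
title('xZCAWhite');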

Code

close all

%%================================================================
%% Step 0: Load data
%  We have provided the code to load data from pcaData.txt into x.
%  x is a 2 * 45 matrix, where the kth column x(:,k) corresponds to
%  the kth data point.
%  You do not need to change the code below.

x = load('pcaData.txt', '-ascii');   % load the data
figure();
scatter(x(1, :), x(2, :));           % plot the raw data points as circles
title('Raw data');

%%================================================================
%% Step 1a: Implement PCA to obtain U
%  Implement PCA to obtain the rotation matrix U, which is the eigenbasis
%  of sigma.

% -------------------- YOUR CODE HERE --------------------
u = zeros(size(x, 1));               % You need to compute this
[n, m] = size(x);
% x = x - repmat(mean(x, 2), 1, m);  % preprocessing: subtract each dimension's mean (this dataset is already zero-mean)
sigma = (1.0 / m) * x * x';          % covariance matrix
[u, s, v] = svd(sigma);
% --------------------------------------------------------

hold on
plot([0 u(1,1)], [0 u(2,1)]);        % draw the first basis direction
plot([0 u(1,2)], [0 u(2,2)]);        % draw the second basis direction
scatter(x(1, :), x(2, :));
hold off

%%================================================================
%% Step 1b: Compute xRot, the projection on to the eigenbasis
%  Now, compute xRot by projecting the data on to the basis defined
%  by U. Visualize the points by performing a scatter plot.

% -------------------- YOUR CODE HERE --------------------
xRot = zeros(size(x));               % You need to compute this
xRot = u' * x;
% --------------------------------------------------------

% Visualise the covariance matrix. You should see a line across the
% diagonal against a blue background.
figure();
scatter(xRot(1, :), xRot(2, :));
title('xRot');

%%================================================================
%% Step 2: Reduce the number of dimensions from 2 to 1.
%  Compute xRot again (this time projecting to 1 dimension).
%  Then, compute xHat by projecting the xRot back onto the original axes
%  to see the effect of dimension reduction.

% -------------------- YOUR CODE HERE --------------------
k = 1;                               % Use k = 1 and project the data onto the first eigenbasis
xHat = zeros(size(x));               % You need to compute this
xHat = u * ([u(:,1), zeros(n,1)]' * x);  % dimension reduction
% the points now lie along the first principal direction rather than
% spanning the original coordinate axes
% --------------------------------------------------------

figure();
scatter(xHat(1, :), xHat(2, :));
title('xHat');

%%================================================================
%% Step 3: PCA Whitening
%  Compute xPCAWhite and plot the results.

epsilon = 1e-5;
% -------------------- YOUR CODE HERE --------------------
xPCAWhite = zeros(size(x));          % You need to compute this
xPCAWhite = diag(1 ./ sqrt(diag(s) + epsilon)) * u' * x;
% divide each component by the square root of its eigenvalue so that
% every component has (approximately) unit variance
% --------------------------------------------------------

figure();
scatter(xPCAWhite(1, :), xPCAWhite(2, :));
title('xPCAWhite');

%%================================================================
%% Step 4: ZCA Whitening
%  Compute xZCAWhite and plot the results.

% -------------------- YOUR CODE HERE --------------------
xZCAWhite = zeros(size(x));          % You need to compute this
xZCAWhite = u * diag(1 ./ sqrt(diag(s) + epsilon)) * u' * x;
% --------------------------------------------------------

figure();
scatter(xZCAWhite(1, :), xZCAWhite(2, :));
title('xZCAWhite');

%% Congratulations! When you have reached this point, you are done!
%  You can now move onto the next PCA exercise. :)
