1. Feedforward and cost function
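
With $h_\theta(x^{(i)})$ computed by the feedforward pass and $K$ output classes (here $K = 10$), the unregularized cost is:

$$J(\theta) = \frac{1}{m} \sum_{i=1}^{m} \sum_{k=1}^{K} \left[ -y_k^{(i)} \log\big((h_\theta(x^{(i)}))_k\big) - \big(1 - y_k^{(i)}\big) \log\big(1 - (h_\theta(x^{(i)}))_k\big) \right]$$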

2. Regularized cost function
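
Adding the weight-decay term over all non-bias weights of both layers (the network in this exercise is 400-25-10):

$$J(\theta) = \frac{1}{m} \sum_{i=1}^{m} \sum_{k=1}^{K} \left[ -y_k^{(i)} \log\big((h_\theta(x^{(i)}))_k\big) - \big(1 - y_k^{(i)}\big) \log\big(1 - (h_\theta(x^{(i)}))_k\big) \right] + \frac{\lambda}{2m} \left[ \sum_{j=1}^{25} \sum_{k=1}^{400} \big(\Theta_{j,k}^{(1)}\big)^2 + \sum_{j=1}^{10} \sum_{k=1}^{25} \big(\Theta_{j,k}^{(2)}\big)^2 \right]$$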

3. Sigmoid gradient

The gradient of the sigmoid function can be computed as:

$$g'(z) = \frac{d}{dz} g(z) = g(z)\big(1 - g(z)\big)$$

where:

$$g(z) = \mathrm{sigmoid}(z) = \frac{1}{1 + e^{-z}}$$
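
In code this is a one-liner; a minimal sigmoidGradient.m, assuming the sigmoid.m from the earlier exercises is on the path:

sigmoidGradient.m

 function g = sigmoidGradient(z)
%SIGMOIDGRADIENT returns the gradient of the sigmoid function
%evaluated at z. Works element-wise, so z can be a matrix,
%a vector, or a scalar.
g = sigmoid(z) .* (1 - sigmoid(z));
end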

4. Random initialization

randInitializeWeights.m

 function W = randInitializeWeights(L_in, L_out)
%RANDINITIALIZEWEIGHTS Randomly initialize the weights of a layer with L_in
%incoming connections and L_out outgoing connections
%   W = RANDINITIALIZEWEIGHTS(L_in, L_out) randomly initializes the weights
%   of a layer with L_in incoming connections and L_out outgoing
%   connections.
%
%   Note that W should be set to a matrix of size(L_out, 1 + L_in) as
%   the first column of W handles the "bias" terms

% You need to return the following variables correctly
W = zeros(L_out, 1 + L_in);

% ====================== YOUR CODE HERE ======================
% Instructions: Initialize W randomly so that we break the symmetry while
%               training the neural network.
%
% Note: The first column of W corresponds to the parameters for the bias units
%
epsilon_init = 0.12;
W = rand(L_out, 1 + L_in) * 2 * epsilon_init - epsilon_init;

% =========================================================================

end
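
A sketch of how ex4.m uses it (variable names as in the course scripts): one weight matrix per layer, then both unrolled into a single parameter vector for the optimizer:

 initial_Theta1 = randInitializeWeights(input_layer_size, hidden_layer_size);
initial_Theta2 = randInitializeWeights(hidden_layer_size, num_labels);
% Unroll parameters into one long vector
initial_nn_params = [initial_Theta1(:) ; initial_Theta2(:)];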

5. Backpropagation (implemented with a for-loop for t = 1:m, placing steps 1-4 below inside the loop), with the t-th iteration performing the calculation on the t-th training example $(x^{(t)}, y^{(t)})$. Step 5 divides the accumulated gradients by m to obtain the gradients for the neural network cost function.

(1) Set the input layer's values $(a^{(1)})$ to the t-th training example $x^{(t)}$. Perform a feedforward pass, computing the activations $(z^{(2)}, a^{(2)}, z^{(3)}, a^{(3)})$ for layers 2 and 3.

(2) For each output unit k in layer 3 (the output layer), set:

$$\delta_k^{(3)} = a_k^{(3)} - y_k$$

where $y_k \in \{0, 1\}$ indicates whether the current training example belongs to class k ($y_k = 1$) or not ($y_k = 0$).

(3) For the hidden layer $l = 2$, set:

$$\delta^{(2)} = \big(\Theta^{(2)}\big)^T \delta^{(3)} \mathbin{.*} g'\big(z^{(2)}\big)$$

(4) Accumulate the gradient from this example using the following formula. Note that you should skip or remove $\delta_0^{(2)}$ (the error term for the bias unit):

$$\Delta^{(l)} = \Delta^{(l)} + \delta^{(l+1)} \big(a^{(l)}\big)^T$$

(5) Obtain the (unregularized) gradient for the neural network cost function by dividing the accumulated gradients by m:

$$D_{ij}^{(l)} = \frac{1}{m} \Delta_{ij}^{(l)}$$

nnCostFunction.m

 function [J grad] = nnCostFunction(nn_params, ...
                                   input_layer_size, ...
                                   hidden_layer_size, ...
                                   num_labels, ...
                                   X, y, lambda)
%NNCOSTFUNCTION Implements the neural network cost function for a two layer
%neural network which performs classification
%   [J grad] = NNCOSTFUNCTON(nn_params, hidden_layer_size, num_labels, ...
%   X, y, lambda) computes the cost and gradient of the neural network. The
%   parameters for the neural network are "unrolled" into the vector
%   nn_params and need to be converted back into the weight matrices.
%
%   The returned parameter grad should be a "unrolled" vector of the
%   partial derivatives of the neural network.

% Reshape nn_params back into the parameters Theta1 and Theta2, the weight
% matrices for our 2 layer neural network
Theta1 = reshape(nn_params(1:hidden_layer_size * (input_layer_size + 1)), ...
                 hidden_layer_size, (input_layer_size + 1));
Theta2 = reshape(nn_params((1 + (hidden_layer_size * (input_layer_size + 1))):end), ...
                 num_labels, (hidden_layer_size + 1));

% Setup some useful variables
m = size(X, 1);

% You need to return the following variables correctly
J = 0;
Theta1_grad = zeros(size(Theta1));
Theta2_grad = zeros(size(Theta2));

% ====================== YOUR CODE HERE ======================
% Instructions: You should complete the code by working through the
%               following parts.
%
% Part 1: Feedforward the neural network and return the cost in the
%         variable J. After implementing Part 1, you can verify that your
%         cost function computation is correct by verifying the cost
%         computed in ex4.m
%
% Part 2: Implement the backpropagation algorithm to compute the gradients
%         Theta1_grad and Theta2_grad. You should return the partial derivatives of
%         the cost function with respect to Theta1 and Theta2 in Theta1_grad and
%         Theta2_grad, respectively. After implementing Part 2, you can check
%         that your implementation is correct by running checkNNGradients
%
%         Note: The vector y passed into the function is a vector of labels
%               containing values from 1..K. You need to map this vector into a
%               binary vector of 1's and 0's to be used with the neural network
%               cost function.
%
%         Hint: We recommend implementing backpropagation using a for-loop
%               over the training examples if you are implementing it for the
%               first time.
%
% Part 3: Implement regularization with the cost function and gradients.
%
%         Hint: You can implement this around the code for
%               backpropagation. That is, you can compute the gradients for
%               the regularization separately and then add them to Theta1_grad
%               and Theta2_grad from Part 2.

% Part 1: feedforward pass and cost
% Theta1 has size 25*401
% Theta2 has size 10*26
% y has size 5000*1
K = num_labels;
Y = eye(K)(y,:);            % [5000 10] one-hot labels (Octave indexing)
a1 = [ones(m,1), X];        % [5000 401]
a2 = sigmoid(a1*Theta1');   % [5000 25]
a2 = [ones(m,1), a2];       % [5000 26]
h = sigmoid(a2*Theta2');    % [5000 10]
costPositive = -Y .* log(h);
costNegative = (1-Y) .* log(1-h);
cost = costPositive - costNegative;
J = (1/m)*sum(cost(:));
% Regularized cost: the bias column of each Theta is excluded
Theta1Filtered = Theta1(:,2:end);   % [25 400]
Theta2Filtered = Theta2(:,2:end);   % [10 25]
reg = (lambda/(2*m))*(sumsq(Theta1Filtered(:)) + sumsq(Theta2Filtered(:)));
J = J + reg;

% Part 2: backpropagation
Delta1 = 0;
Delta2 = 0;
for t = 1:m,
  % step 1: feedforward the t-th example
  a1 = [1 X(t,:)];        % [1 401]
  z2 = a1*Theta1';        % [1 25]
  a2 = [1 sigmoid(z2)];   % [1 26]
  z3 = a2*Theta2';        % [1 10]
  a3 = sigmoid(z3);       % [1 10]
  % step 2: error at the output layer
  yt = Y(t,:);            % [1 10]
  d3 = a3 - yt;           % [1 10]
  % step 3: error at the hidden layer (bias column already removed)
  %   [1 25]  =  [1 10]*[10 25]     .*      [1 25]
  d2 = (d3*Theta2Filtered).*sigmoidGradient(z2);  % [1 25]
  % step 4: accumulate the gradient
  Delta1 = Delta1 + (d2'*a1);   % [25 401]
  Delta2 = Delta2 + (d3'*a2);   % [10 26]
end;
% step 5: average over the m examples
Theta1_grad = (1/m)*Delta1;
Theta2_grad = (1/m)*Delta2;

% Part 3: regularize the gradients (skip the bias column, j = 0)
Theta1_grad(:,2:end) = Theta1_grad(:,2:end) + ((lambda/m)*Theta1Filtered);
Theta2_grad(:,2:end) = Theta2_grad(:,2:end) + ((lambda/m)*Theta2Filtered);

% -------------------------------------------------------------

% =========================================================================

% Unroll gradients
grad = [Theta1_grad(:) ; Theta2_grad(:)];

end
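
Once the gradients are verified, training follows the usual ex4.m pattern; a sketch, assuming the course-provided fmincg.m and the initial_nn_params from section 4:

 options = optimset('MaxIter', 50);
lambda = 1;
% Short-hand for the cost function to be minimized
costFunction = @(p) nnCostFunction(p, input_layer_size, hidden_layer_size, ...
                                   num_labels, X, y, lambda);
[nn_params, cost] = fmincg(costFunction, initial_nn_params, options);
% Recover Theta1 and Theta2 from the unrolled solution
Theta1 = reshape(nn_params(1:hidden_layer_size * (input_layer_size + 1)), ...
                 hidden_layer_size, (input_layer_size + 1));
Theta2 = reshape(nn_params((1 + (hidden_layer_size * (input_layer_size + 1))):end), ...
                 num_labels, (hidden_layer_size + 1));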

6. Gradient checking

Let

$$\theta^{(i+)} = \theta + \epsilon\, e_i$$

and

$$\theta^{(i-)} = \theta - \epsilon\, e_i,$$

where $e_i$ is the $i$-th unit vector (all zeros except a 1 in position $i$). The numerical approximation should then satisfy, for each $i$:

$$f_i(\theta) := \frac{\partial}{\partial \theta_i} J(\theta) \approx \frac{J(\theta^{(i+)}) - J(\theta^{(i-)})}{2\epsilon}$$

computeNumericalGradient.m

 function numgrad = computeNumericalGradient(J, theta)
%COMPUTENUMERICALGRADIENT Computes the gradient using "finite differences"
%and gives us a numerical estimate of the gradient.
%   numgrad = COMPUTENUMERICALGRADIENT(J, theta) computes the numerical
%   gradient of the function J around theta. Calling y = J(theta) should
%   return the function value at theta.

% Notes: The following code implements numerical gradient checking, and
%        returns the numerical gradient. It sets numgrad(i) to (a numerical
%        approximation of) the partial derivative of J with respect to the
%        i-th input argument, evaluated at theta. (i.e., numgrad(i) should
%        be the (approximately) the partial derivative of J with respect
%        to theta(i).)

numgrad = zeros(size(theta));
perturb = zeros(size(theta));
e = 1e-4;
for p = 1:numel(theta)
  % Set perturbation vector
  perturb(p) = e;
  loss1 = J(theta - perturb);
  loss2 = J(theta + perturb);
  % Compute Numerical Gradient
  numgrad(p) = (loss2 - loss1) / (2*e);
  perturb(p) = 0;
end

end
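
A condensed sketch of the comparison that checkNNGradients.m performs, using a tiny network so the loop over numel(theta) stays cheap. The random debug data below is an assumption for illustration; the course file builds deterministic values with its debugInitializeWeights helper:

 % Build a small network so numerical checking is fast
input_layer_size = 3; hidden_layer_size = 5; num_labels = 3; m = 5;
Theta1 = randInitializeWeights(input_layer_size, hidden_layer_size);
Theta2 = randInitializeWeights(hidden_layer_size, num_labels);
X = rand(m, input_layer_size);      % assumption: random debug inputs
y = 1 + mod(1:m, num_labels)';      % labels in 1..num_labels
lambda = 0;
nn_params = [Theta1(:) ; Theta2(:)];
costFunc = @(p) nnCostFunction(p, input_layer_size, hidden_layer_size, ...
                               num_labels, X, y, lambda);
[cost, grad] = costFunc(nn_params);
numgrad = computeNumericalGradient(costFunc, nn_params);
% Relative difference should be very small (the exercise expects < 1e-9)
diff = norm(numgrad - grad) / norm(numgrad + grad)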

7. Regularized Neural Networks

For $j = 0$ (the bias column, which is not regularized):

$$\frac{\partial}{\partial \Theta_{ij}^{(l)}} J(\Theta) = D_{ij}^{(l)} = \frac{1}{m} \Delta_{ij}^{(l)}$$

For $j \ge 1$:

$$\frac{\partial}{\partial \Theta_{ij}^{(l)}} J(\Theta) = D_{ij}^{(l)} = \frac{1}{m} \Delta_{ij}^{(l)} + \frac{\lambda}{m} \Theta_{ij}^{(l)}$$
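
These two cases are exactly what the last two assignments in nnCostFunction.m above implement: the bias column (j = 0) is left unregularized. To confirm the regularized gradients, gradient checking can be re-run with a nonzero lambda; a sketch reusing the comparison above:

 % Re-check gradients with regularization turned on
lambda = 3;
costFunc = @(p) nnCostFunction(p, input_layer_size, hidden_layer_size, ...
                               num_labels, X, y, lambda);
[cost, grad] = costFunc(nn_params);
numgrad = computeNumericalGradient(costFunc, nn_params);
diff = norm(numgrad - grad) / norm(numgrad + grad)  % should still be < 1e-9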

Someone else's code:

https://github.com/jcgillespie/Coursera-Machine-Learning/tree/master/ex4
