CheeseZH: Stanford University: Machine Learning Ex3: Multiclass Logistic Regression and Neural Network Prediction
Handwritten digit recognition (0-9)
Multi-class Logistic Regression
1. Vectorizing Logistic Regression
(1) Vectorizing the cost function
(2) Vectorizing the gradient
(3) Vectorizing the regularized cost function
(4) Vectorizing the regularized gradient
All four formulas above can be found in the previous post: click here.
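For quick reference, with $h = g(X\theta)$ (where $g$ is the sigmoid), the standard vectorized forms from the course are:

$$J(\theta) = \frac{1}{m}\left[-y^{T}\log(h) - (1-y)^{T}\log(1-h)\right] + \frac{\lambda}{2m}\sum_{j=1}^{n}\theta_{j}^{2}$$

$$\frac{\partial J}{\partial \theta} = \frac{1}{m}X^{T}(h-y) + \frac{\lambda}{m}\theta \quad (\text{with } \theta_{0} \text{ excluded from the regularization term})$$

Setting $\lambda = 0$ recovers the unregularized cost (1) and gradient (2).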
lrCostFunction.m
function [J, grad] = lrCostFunction(theta, X, y, lambda)
%LRCOSTFUNCTION Compute cost and gradient for logistic regression with
%regularization
% J = LRCOSTFUNCTION(theta, X, y, lambda) computes the cost of using
% theta as the parameter for regularized logistic regression and the
% gradient of the cost w.r.t. the parameters.

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly
J = 0;
grad = zeros(size(theta));

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta.
% You should set J to the cost.
% Compute the partial derivatives and set grad to the partial
% derivatives of the cost w.r.t. each parameter in theta
%
% Hint: The computation of the cost function and gradients can be
% efficiently vectorized. For example, consider the computation
%
% sigmoid(X * theta)
%
% Each row of the resulting matrix will contain the value of the
% prediction for that example. You can make use of this to vectorize
% the cost function and gradient computations.
%
% Hint: When computing the gradient of the regularized cost function,
% there're many possible vectorized solutions, but one solution
% looks like:
% grad = (unregularized gradient for logistic regression)
% temp = theta;
% temp(1) = 0; % because we don't add anything for j = 0
% grad = grad + YOUR_CODE_HERE (using the temp variable)
hx = sigmoid(X*theta);
reg = lambda/(2*m)*sum(theta(2:end).^2);
J = -1/m*(y'*log(hx)+(1-y)'*log(1-hx)) + reg;
theta(1) = 0;                        % do not regularize the bias term
grad = 1/m*X'*(hx-y)+lambda/m*theta;
% =============================================================

grad = grad(:);

end
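A quick sanity check (the test values below are the ones later versions of the exercise use; it assumes sigmoid.m from the assignment is on the path):

theta_t = [-2; -1; 1; 2];
X_t = [ones(5,1) reshape(1:15,5,3)/10];
y_t = [1; 0; 1; 0; 1];
[J_t, grad_t] = lrCostFunction(theta_t, X_t, y_t, 3);
% J_t should come out to roughly 2.534819 for these inputs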
2. One-vs-all Classification (Training)
Return all the classifier parameters in a matrix Θ (a K × (N+1) matrix, where K is num_labels and N is num_features), where each row of Θ holds the learned logistic regression parameters for one class. You can do this with a for-loop from 1 to K, training each classifier independently.
oneVsAll.m
function [all_theta] = oneVsAll(X, y, num_labels, lambda)
%ONEVSALL trains multiple logistic regression classifiers and returns all
%the classifiers in a matrix all_theta, where the i-th row of all_theta
%corresponds to the classifier for label i
% [all_theta] = ONEVSALL(X, y, num_labels, lambda) trains num_labels
% logistic regression classifiers and returns each of these classifiers
% in a matrix all_theta, where the i-th row of all_theta corresponds
% to the classifier for label i

% Some useful variables
m = size(X, 1);
n = size(X, 2);

% You need to return the following variables correctly
all_theta = zeros(num_labels, n + 1);

% Add ones to the X data matrix
X = [ones(m, 1) X];

% ====================== YOUR CODE HERE ======================
% Instructions: You should complete the following code to train num_labels
% logistic regression classifiers with regularization
% parameter lambda.
%
% Hint: theta(:) will return a column vector.
%
% Hint: You can use y == c to obtain a vector of 1's and 0's that tell you
% whether the ground truth is true/false for this class.
%
% Note: For this assignment, we recommend using fmincg to optimize the cost
% function. It is okay to use a for-loop (for c = 1:num_labels) to
% loop over the different classes.
%
% fmincg works similarly to fminunc, but is more efficient when we
% are dealing with a large number of parameters.
%
% Example Code for fmincg:
%
% % Set Initial theta
% initial_theta = zeros(n + 1, 1);
%
% % Set options for fminunc
% options = optimset('GradObj', 'on', 'MaxIter', 50);
%
% % Run fmincg to obtain the optimal theta
% % This function will return theta and the cost
% [theta] = ...
% fmincg (@(t)(lrCostFunction(t, X, (y == c), lambda)), ...
% initial_theta, options);
for c = 1:num_labels,
    initial_theta = all_theta(c,:)';
    options = optimset('GradObj','on','MaxIter',50);
    theta = fmincg(@(t)(lrCostFunction(t,X,(y==c),lambda)),initial_theta,options);
    all_theta(c,:) = theta';
end;
% =========================================================================

end
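With the ex3 data loaded, training all ten classifiers then looks like this (lambda = 0.1 is the value the exercise script uses):

load('ex3data1.mat');      % provides X (5000x400 pixel features) and y (5000x1 labels)
num_labels = 10;           % the digit "0" is stored as label 10
lambda = 0.1;
all_theta = oneVsAll(X, y, num_labels, lambda);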
3. One-vs-all Classification (Prediction)
predictOneVsAll.m
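The post does not include the body of this file; a minimal sketch consistent with the skeletons above (pick, for each example, the class whose classifier scores highest) is:

function p = predictOneVsAll(all_theta, X)
%PREDICTONEVSALL Predict the label for a trained one-vs-all classifier
%   p = PREDICTONEVSALL(all_theta, X) returns a vector of predictions for
%   each example in X, where all_theta is the K x (N+1) matrix learned by
%   oneVsAll.
m = size(X, 1);

% Add ones to the X data matrix, matching the training setup
X = [ones(m, 1) X];

% Row i of X*all_theta' holds the K classifier scores for example i;
% the predicted label is the index of the largest score
[tmp, p] = max(sigmoid(X * all_theta'), [], 2);

end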
Neural Network Prediction
Feedforward Propagation and Prediction
predict.m
function p = predict(Theta1, Theta2, X)
%PREDICT Predict the label of an input given a trained neural network
% p = PREDICT(Theta1, Theta2, X) outputs the predicted label of X given the
% trained weights of a neural network (Theta1, Theta2)

% Useful values
m = size(X, 1);
num_labels = size(Theta2, 1);

% You need to return the following variables correctly
p = zeros(size(X, 1), 1);

% ====================== YOUR CODE HERE ======================
% Instructions: Complete the following code to make predictions using
% your learned neural network. You should set p to a
% vector containing labels between 1 to num_labels.
%
% Hint: The max function might come in useful. In particular, the max
% function can also return the index of the max element, for more
% information see 'help max'. If your examples are in rows, then, you
% can use max(A, [], 2) to obtain the max for each row.
%
a1 = [ones(size(X,1), 1), X];    % 5000x401: add the bias unit to the input layer
a2 = sigmoid(a1*Theta1');        % 5000x25: hidden layer activations
a2 = [ones(size(a2,1), 1), a2];  % 5000x26: add the bias unit to the hidden layer
a3 = sigmoid(a2*Theta2');        % 5000x10: output layer activations
[tmp, p] = max(a3, [], 2);       % predicted label = index of the largest output
% =========================================================================

end
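Typical usage, following the course's ex3_nn.m driver (ex3weights.mat ships pre-trained Theta1 and Theta2):

load('ex3data1.mat');      % X, y
load('ex3weights.mat');    % Theta1 (25x401), Theta2 (10x26)
pred = predict(Theta1, Theta2, X);
fprintf('Training Set Accuracy: %f\n', mean(double(pred == y)) * 100);
% the exercise quotes about 97.5% accuracy with these pre-trained weights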
Other files and the dataset can be downloaded from Coursera.