Logistic/Softmax Regression
Helper functions
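A minimal sketch of a sigmoid helper, equivalent to the inline definition used in the scripts below (the file name sigmoid.m and the function name are assumptions, not taken from the original post):

% sigmoid.m -- hypothetical stand-alone helper, equivalent to the
% inline definition g = inline('1 ./ (1 + exp(-z))') used in the script below.
function h = sigmoid(z)
    % element-wise logistic function; works for scalars, vectors and matrices
    h = 1 ./ (1 + exp(-z));
end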




Introduction to Newton's method
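In brief: for logistic regression, Newton's method replaces the fixed learning rate of gradient descent with the inverse Hessian. Writing h_i = g(x_i * theta) for the prediction on example i (x_i a row vector), the update is

theta := theta - H \ grad,  where grad = (1/m) * sum_i (h_i - y_i) * x_i'  and  H = (1/m) * sum_i h_i * (1 - h_i) * x_i' * x_i.

Because the Hessian captures the local curvature of the objective, the method usually converges in a handful of iterations; this is exactly what the third block of the script below implements with the accumulators delta (gradient) and delta_H (Hessian).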




%% Logistic Regression
close all
clear
%% load data
x = load('ex4x.dat');
y = load('ex4y.dat');
[m, n] = size(x);
% Add intercept term to x
x = [ones(m, 1), x];
%% draw picture
% find returns the indices of the
% rows meeting the specified condition
pos = find(y == 1);
neg = find(y == 0);
% Assume the features are in the 2nd and 3rd
% columns of x
figure('NumberTitle', 'off', 'Name', 'GD');
plot(x(pos, 2), x(pos, 3), '+');
hold on;
plot(x(neg, 2), x(neg, 3), 'o');
% Define the sigmoid function
% (inline is deprecated in newer MATLAB; g = @(z) 1./(1+exp(-z)) is equivalent)
g = inline('1 ./ (1 + exp(-z))');
alpha = 0.001;
theta = [-1, 0, 0]';   % initial parameters (exact starting values assumed)
obj_old = 1e10;
tor = 1e-6;            % convergence tolerance (exact value assumed)
tic
%% Gradient Descent
for time = 1:10000     % maximum number of iterations (exact value assumed)
    delta = zeros(3, 1);
    objective = 0;
    for i = 1:m
        z = x(i,:) * theta;
        h = g(z);      % map through the logistic (sigmoid) function
        delta = (1/m) .* x(i,:)' * (y(i) - h) + delta;
        objective = (1/m) .* (-y(i) * log(h) - (1 - y(i)) * log(1 - h)) + objective;
    end
    theta = theta + alpha * delta;
    fprintf('objective is %.4f\n', objective);
    if abs(obj_old - objective) < tor
        fprintf('tolerance is smaller than %.4f\n', tor);
        break;
    end
    obj_old = objective;
end
%% Calculate the decision boundary line
plot_x = [min(x(:,2)), max(x(:,2))];
plot_y = (-1./theta(3)) .* (theta(2) .* plot_x + theta(1));
plot(plot_x, plot_y)
legend('Admitted', 'Not admitted', 'Decision Boundary')
hold off
toc
pause();
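% The next block refits the same model with mini-batch stochastic gradient
% descent: instead of summing the gradient over all m training examples
% before every update, each iteration draws k examples at random and
% averages the gradient over that mini-batch only. Each update is much
% cheaper, but the printed objective is noisier because it is estimated
% from just k samples.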
%% SGD
figure('NumberTitle', 'off', 'Name', 'SGD');
plot(x(pos, 2), x(pos, 3), '+');
hold on;
plot(x(neg, 2), x(neg, 3), 'o');
alpha = 0.001;
theta = [-1, 0, 0]';   % initial parameters (exact starting values assumed)
obj_old = 1e10;
tor = 1e-6;            % convergence tolerance (exact value assumed)
k = 10;                % mini-batch size (exact value assumed)
U = ceil(m/k);
tic
for time = 1:10000     % maximum number of iterations (exact value assumed)
    delta = zeros(3, 1);
    rand('twister', time*100);   % reseed so each iteration draws a different batch (seed factor assumed)
    idx = randperm(m);
    objective = 0;
    subidx = idx(1:k);           % pick k random examples as this iteration's mini-batch
    for i = 1:length(subidx)
        z = x(subidx(i),:) * theta;
        h = g(z);      % map through the logistic (sigmoid) function
        delta = (1/k) .* x(subidx(i),:)' * (y(subidx(i)) - h) + delta;
        objective = (1/k) .* (-y(subidx(i)) * log(h) - (1 - y(subidx(i))) * log(1 - h)) + objective;
    end
    theta = theta + alpha * delta;
    fprintf('objective is %.4f\n', objective);
    if abs(obj_old - objective) < tor
        fprintf('tolerance is smaller than %.4f\n', tor);
        break;
    end
    obj_old = objective;
end
%% Calculate the decision boundary line
plot_x = [min(x(:,2)), max(x(:,2))];
plot_y = (-1./theta(3)) .* (theta(2) .* plot_x + theta(1));
plot(plot_x, plot_y)
legend('Admitted', 'Not admitted', 'Decision Boundary')
hold off
toc
pause();
%% Newton's method
figure('NumberTitle', 'off', 'Name', 'Newton');
plot(x(pos, 2), x(pos, 3), '+');
hold on;
plot(x(neg, 2), x(neg, 3), 'o');
alpha = 0.001;
theta = zeros(3, 1);
obj_old = 1e10;
tor = 1e-6;            % convergence tolerance (exact value assumed)
tic
for time = 1:10        % Newton's method needs only a few iterations (count assumed)
    delta = zeros(3, 1);     % gradient accumulator
    delta_H = zeros(3, 3);   % Hessian accumulator
    objective = 0;
    % Calculate the hypothesis function
    for i = 1:m
        z = x(i,:) * theta;
        h = g(z);      % map through the logistic (sigmoid) function
        delta = (1/m) .* x(i,:)' * (h - y(i)) + delta;
        delta_H = (1/m) .* x(i,:)' * h * (1 - h) * x(i,:) + delta_H;
        objective = (1/m) .* (-y(i) * log(h) - (1 - y(i)) * log(1 - h)) + objective;
    end
    theta = theta - delta_H \ delta;   % Newton step: solve delta_H * step = delta
    fprintf('objective is %.4f\n', objective);
    if abs(obj_old - objective) < tor
        fprintf('tolerance is smaller than %.4f\n', tor);
        break;
    end
    obj_old = objective;
end
%% Calculate the decision boundary line
plot_x = [min(x(:,2)), max(x(:,2))];
plot_y = (-1./theta(3)) .* (theta(2) .* plot_x + theta(1));
plot(plot_x, plot_y)
legend('Admitted', 'Not admitted', 'Decision Boundary')
hold off
toc
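For reference, the per-example loops above can be collapsed into matrix operations. The following fully vectorized batch version is a sketch, not part of the original post; it assumes x, y, m, g, alpha and tor are already defined as in the script above and produces the same updates as the gradient-descent section:

% Vectorized batch gradient descent for logistic regression (sketch only).
theta = zeros(3, 1);
obj_old = 1e10;
for time = 1:10000
    h = g(x * theta);                   % m-by-1 vector of predictions
    delta = (1/m) .* x' * (y - h);      % full-batch gradient (ascent direction on the log-likelihood)
    objective = (1/m) .* sum(-y .* log(h) - (1 - y) .* log(1 - h));
    theta = theta + alpha * delta;
    if abs(obj_old - objective) < tor, break; end
    obj_old = objective;
end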
%% Softmax Regression
close all
clear
%% load data
load('my_ex4x.mat');
load('my_ex4y.mat');
[m, n] = size(x);
% Add intercept term to x
x = [ones(m, 1), x];
y = y + 1;             % shift labels so the classes are 1, 2, 3
class_num = max(y);
n = n + 1;
%% draw picture
% find returns the indices of the
% rows meeting the specified condition
class2 = find(y == 2);
class1 = find(y == 1);
class3 = find(y == 3);
% Assume the features are in the 2nd and 3rd
% columns of x
figure('NumberTitle', 'off', 'Name', 'GD');
plot(x(class2, 2), x(class2, 3), '+');
hold on;
plot(x(class1, 2), x(class1, 3), 'o');
hold on;
plot(x(class3, 2), x(class3, 3), '*');
hold on;
% Define the softmax hypothesis function
g = inline('exp(z) ./ sumz', 'z', 'sumz');
alpha = 0.0001;
% Initial parameters: one column per non-reference class
% (some of the original starting values were lost; these are assumed)
theta = [-1, 0.15, 0.14; -1, 0.15, -0.14]';
obj_old = 1e10;
tor = 1e-6;            % convergence tolerance (exact value assumed)
%% Gradient Descent
for time = 1:10000     % maximum number of iterations (exact value assumed)
    delta = zeros(3, 1);
    objective = 0;
    % (the objective below reuses the binary cross-entropy form with the raw
    % class labels, so treat it only as a rough convergence monitor)
    for i = 1:m
        for j = 1:class_num-1   % class 3 is the reference class, so only two parameter vectors are learned
            z = x(i,:) * theta(:,j);
            % the trailing "+ 1" is exp(0) for the reference class
            sumz = exp(x(i,:) * theta(:,1)) + exp(x(i,:) * theta(:,2)) + 1;
            h = g(z, sumz);     % class-j probability under the current parameters
            if y(i) == j
                delta = (1/m) .* x(i,:)' * (1 - h);
                theta(:,j) = theta(:,j) + alpha * delta;
                objective = (1/m) .* (-y(i) * log(h)) + objective;
            else
                delta = (1/m) .* x(i,:)' * (0 - h);
                theta(:,j) = theta(:,j) + alpha * delta;
                objective = (1/m) .* (-(1 - y(i)) * log(1 - h)) + objective;
            end
        end
    end
    fprintf('objective is %.4f\n', objective);
    if abs(obj_old - objective) < tor
        fprintf('tolerance is smaller than %.4f\n', tor);
        break;
    end
    obj_old = objective;
end
%% Calculate the decision boundary lines
plot_x = [min(x(:,2)), max(x(:,2))];
plot_y = (-1./theta(3,1)) .* (theta(2,1) .* plot_x + theta(1,1));
plot(plot_x, plot_y)
hold on
plot_y = (-1./theta(3,2)) .* (theta(2,2) .* plot_x + theta(1,2));
plot(plot_x, plot_y)
legend('Class 2', 'Class 1', 'Class 3', 'Boundary for class 1', 'Boundary for class 2')
hold off
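Once training has converged, predicted labels can be read off by comparing the three unnormalized class scores, remembering that the reference class (class 3 here) has an implicit score of exp(0) = 1. A small sketch, not part of the original post (the variable names scores, pred and accuracy are placeholders):

scores = [exp(x * theta(:,1)), exp(x * theta(:,2)), ones(m, 1)];   % one column of scores per class
[~, pred] = max(scores, [], 2);        % pred(i) is the predicted class of example i
accuracy = mean(pred == y);            % compare against the shifted labels y in {1, 2, 3}
fprintf('training accuracy: %.4f\n', accuracy);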