Helper functions

An introduction to Newton's method
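For the logistic-regression cost J(θ), Newton's method replaces the fixed learning rate of gradient descent with a step scaled by the inverse Hessian. Using the same notation as the code below, where h⁽ⁱ⁾ = g(θᵀx⁽ⁱ⁾) is the sigmoid output for sample i:

```latex
\theta := \theta - H^{-1}\nabla_\theta J(\theta),\qquad
\nabla_\theta J(\theta) = \frac{1}{m}\sum_{i=1}^{m}\bigl(h^{(i)}-y^{(i)}\bigr)\,x^{(i)},\qquad
H = \frac{1}{m}\sum_{i=1}^{m} h^{(i)}\bigl(1-h^{(i)}\bigr)\,x^{(i)}{x^{(i)}}^{T}
```

In the Newton's-method code these correspond to `delta`, `delta_H`, and the update `theta = theta - delta_H \ delta`, where the backslash solves the linear system directly rather than forming the matrix inverse.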

%% Logistic Regression
close all
clear

%% load data
x = load('ex4x.dat');
y = load('ex4y.dat');
[m, n] = size(x);
% Add intercept term to x
x = [ones(m, 1), x];

%% draw picture
% find returns the indices of the
% rows meeting the specified condition
pos = find(y == 1);
neg = find(y == 0);
% Assume the features are in the 2nd and 3rd
% columns of x
figure('NumberTitle', 'off', 'Name', 'GD');
plot(x(pos, 2), x(pos, 3), '+');
hold on;
plot(x(neg, 2), x(neg, 3), 'o');

% Define the sigmoid function
g = inline('1 ./ (1 + exp(-z))');
alpha = 0.001;
theta = zeros(n + 1, 1);   % the original non-zero initial values were lost; zeros work as well
obj_old = 1e10;
tor = 1e-6;                % convergence tolerance (exponent lost in the original)
tic

%% Gradient Descent
for time = 1:10000         % iteration cap (the original bound was lost)
    delta = zeros(n + 1, 1);
    objective = 0;
    for i = 1:m
        z = x(i,:) * theta;
        h = g(z);          % map through the logistic function
        delta = (1/m) .* x(i,:)' * (y(i) - h) + delta;
        objective = (1/m) .* (-y(i) * log(h) - (1 - y(i)) * log(1 - h)) + objective;
    end
    theta = theta + alpha * delta;
    fprintf('objective is %.4f\n', objective);
    if abs(obj_old - objective) < tor
        fprintf('tolerance is smaller than %.4f\n', tor);
        break;
    end
    obj_old = objective;
end

%% Calculate the decision boundary line
plot_x = [min(x(:,2)), max(x(:,2))];
plot_y = (-1./theta(3)) .* (theta(2).*plot_x + theta(1));
plot(plot_x, plot_y)
legend('Admitted', 'Not admitted', 'Decision Boundary')
hold off
toc
pause();
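The gradient-descent loop above minimizes the average cross-entropy cost; in symbols:

```latex
J(\theta) = -\frac{1}{m}\sum_{i=1}^{m}\Bigl[\,y^{(i)}\log h^{(i)} + \bigl(1-y^{(i)}\bigr)\log\bigl(1-h^{(i)}\bigr)\Bigr],\qquad
\theta := \theta + \alpha\,\frac{1}{m}\sum_{i=1}^{m}\bigl(y^{(i)}-h^{(i)}\bigr)\,x^{(i)}
```

Since the gradient of J is (1/m) Σ (h⁽ⁱ⁾ − y⁽ⁱ⁾) x⁽ⁱ⁾, adding α (1/m) Σ (y⁽ⁱ⁾ − h⁽ⁱ⁾) x⁽ⁱ⁾ is exactly one descent step on J.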
%% SGD
figure('NumberTitle', 'off', 'Name', 'SGD');
plot(x(pos, 2), x(pos, 3), '+');
hold on;
plot(x(neg, 2), x(neg, 3), 'o');

alpha = 0.001;
theta = zeros(n + 1, 1);   % the original initial values were lost
obj_old = 1e10;
tor = 1e-6;                % convergence tolerance (exponent lost in the original)
k = 10;                    % mini-batch size (the original value was lost)
U = ceil(m/k);             % number of batches per epoch (not used below)
tic
for time = 1:10000
    delta = zeros(n + 1, 1);
    rand('twister', time); % reseed so each iteration draws a fresh permutation
    idx = randperm(m);
    objective = 0;
    subidx = idx(1:k);     % a random mini-batch of k samples
    for i = 1:length(subidx)
        z = x(subidx(i),:) * theta;
        h = g(z);          % map through the logistic function
        delta = (1/k) .* x(subidx(i),:)' * (y(subidx(i)) - h) + delta;
        objective = (1/k) .* (-y(subidx(i)) * log(h) - (1 - y(subidx(i))) * log(1 - h)) + objective;
    end
    theta = theta + alpha * delta;
    fprintf('objective is %.4f\n', objective);
    if abs(obj_old - objective) < tor
        fprintf('tolerance is smaller than %.4f\n', tor);
        break;
    end
    obj_old = objective;
end

%% Calculate the decision boundary line
plot_x = [min(x(:,2)), max(x(:,2))];
plot_y = (-1./theta(3)) .* (theta(2).*plot_x + theta(1));
plot(plot_x, plot_y)
legend('Admitted', 'Not admitted', 'Decision Boundary')
hold off
toc
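The only change from batch gradient descent is that each step averages over a random subset B of size k instead of all m samples. Over the random choice of B, the mini-batch direction is an unbiased estimate of the full gradient:

```latex
\mathbb{E}_{B}\Bigl[\frac{1}{k}\sum_{i\in B}\bigl(y^{(i)}-h^{(i)}\bigr)\,x^{(i)}\Bigr]
= \frac{1}{m}\sum_{i=1}^{m}\bigl(y^{(i)}-h^{(i)}\bigr)\,x^{(i)}
```

Each step is therefore roughly m/k times cheaper, at the cost of a noisier objective trace, which also makes the tolerance-based stopping test noisier.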
pause();

%% Newton's method
figure('NumberTitle', 'off', 'Name', 'Newton');
plot(x(pos, 2), x(pos, 3), '+');
hold on;
plot(x(neg, 2), x(neg, 3), 'o');

theta = zeros(n + 1, 1);   % Newton's method needs no learning rate
obj_old = 1e10;
tor = 1e-6;                % convergence tolerance (exponent lost in the original)
tic
for time = 1:100           % Newton's method converges in a handful of iterations
    delta = zeros(n + 1, 1);
    delta_H = zeros(n + 1, n + 1);
    objective = 0;
    % Calculate the hypothesis function
    for i = 1:m
        z = x(i,:) * theta;
        h = g(z);          % map through the logistic function
        delta = (1/m) .* x(i,:)' * (h - y(i)) + delta;
        delta_H = (1/m) .* x(i,:)' * h * (1 - h) * x(i,:) + delta_H;
        objective = (1/m) .* (-y(i) * log(h) - (1 - y(i)) * log(1 - h)) + objective;
    end
    theta = theta - delta_H \ delta;   % solve H*step = gradient instead of inverting H
    fprintf('objective is %.4f\n', objective);
    if abs(obj_old - objective) < tor
        fprintf('tolerance is smaller than %.4f\n', tor);
        break;
    end
    obj_old = objective;
end

%% Calculate the decision boundary line
plot_x = [min(x(:,2)), max(x(:,2))];
plot_y = (-1./theta(3)) .* (theta(2).*plot_x + theta(1));
plot(plot_x, plot_y)
legend('Admitted', 'Not admitted', 'Decision Boundary')
hold off
toc
%% Softmax Regression
close all
clear

%% load data
load('my_ex4x.mat');
load('my_ex4y.mat');
[m, n] = size(x);
% Add intercept term to x
x = [ones(m, 1), x];
y = y + 1;                 % shift labels from 0..2 to 1..3
class_num = max(y);
n = n + 1;

%% draw picture
% find returns the indices of the
% rows meeting the specified condition
class1 = find(y == 1);
class2 = find(y == 2);
class3 = find(y == 3);
% Assume the features are in the 2nd and 3rd
% columns of x
figure('NumberTitle', 'off', 'Name', 'GD');
plot(x(class2, 2), x(class2, 3), '+');
hold on;
plot(x(class1, 2), x(class1, 3), 'o');
plot(x(class3, 2), x(class3, 3), '*');

% Define the softmax function
g = inline('exp(z) ./ sumz', 'z', 'sumz');
alpha = 0.0001;
theta = zeros(n, class_num - 1);   % the last class is the reference; its parameters stay 0
                                   % (the original non-zero initial values were lost)
obj_old = 1e10;
tor = 1e-6;                        % convergence tolerance (exponent lost in the original)

%% Gradient Descent
for time = 1:10000
    objective = 0;
    for i = 1:m
        for j = 1:class_num - 1
            z = x(i,:) * theta(:,j);
            sumz = exp(x(i,:) * theta(:,1)) + exp(x(i,:) * theta(:,2)) + 1;
            h = g(z, sumz);        % probability of class j under the softmax model
            if y(i) == j
                delta = (1/m) .* x(i,:)' * (1 - h);
                objective = (1/m) .* (-log(h)) + objective;   % multinomial negative log-likelihood
            else
                delta = (1/m) .* x(i,:)' * (-h);
            end
            theta(:,j) = theta(:,j) + alpha * delta;
        end
    end
    fprintf('objective is %.4f\n', objective);
    if abs(obj_old - objective) < tor
        fprintf('tolerance is smaller than %.4f\n', tor);
        break;
    end
    obj_old = objective;
end

%% Calculate the decision boundary lines
plot_x = [min(x(:,2)), max(x(:,2))];
plot_y = (-1./theta(3,1)) .* (theta(2,1).*plot_x + theta(1,1));
plot(plot_x, plot_y)
plot_y = (-1./theta(3,2)) .* (theta(2,2).*plot_x + theta(1,2));
plot(plot_x, plot_y)
legend('class 2', 'class 1', 'class 3', 'Boundary 1', 'Boundary 2')
hold off
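The `+ 1` in `sumz` comes from treating the last class as a reference whose parameter vector is fixed at zero, so exp(θ_Kᵀx) = exp(0) = 1 and only K − 1 = 2 parameter columns are learned:

```latex
P(y=j\mid x) = \frac{\exp(\theta_j^{T}x)}{1+\sum_{l=1}^{K-1}\exp(\theta_l^{T}x)}\quad (j<K),\qquad
P(y=K\mid x) = \frac{1}{1+\sum_{l=1}^{K-1}\exp(\theta_l^{T}x)}
```

This is the standard way of removing the redundant degree of freedom in softmax regression; with K = 2 it reduces exactly to the logistic model above.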
