matlab(7) Regularized logistic regression: mapFeature (adding more features) and costFunctionReg
Excerpt from ex2_reg.m
%% =========== Part 1: Regularized Logistic Regression ============
% In this part, you are given a dataset with data points that are not
% linearly separable. However, you would still like to use logistic
% regression to classify the data points.
%
% To do so, you introduce more features to use -- in particular, you add
% polynomial features to the data matrix (similar to polynomial
% regression).
%
% Add Polynomial Features
% Note that mapFeature also adds a column of ones for us, so the intercept
% term is handled
X = mapFeature(X(:,1), X(:,2)); % call mapFeature(X1, X2) from mapFeature.m, listed below
% Maps the original two features x1, x2 into all polynomial terms up to degree 6,
% giving 28 features; this allows a more complex decision boundary to be drawn,
% but can also lead to overfitting (depending on the value of λ)
% After the call, X is a 118x28 matrix (118 examples, 28 features, including the leading column of ones)
% Initialize fitting parameters
initial_theta = zeros(size(X, 2), 1); % initial_theta: 28x1
% Set regularization parameter lambda to 1
lambda = 1; % λ = 1; with λ = 0 (no regularization) the model overfits; with λ = 100 there is too much regularization (underfitting)
% Compute and display initial cost and gradient for regularized logistic
% regression
[cost, grad] = costFunctionReg(initial_theta, X, y, lambda); % call costFunctionReg(theta, X, y, lambda) from costFunctionReg.m, listed below
fprintf('Cost at initial theta (zeros): %f\n', cost); % cost at the initial theta (zeros); since sigmoid(0) = 0.5, this is -log(0.5) ≈ 0.693
fprintf('\nProgram paused. Press enter to continue.\n');
pause;
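For reference, the cost function defined here is later handed to an optimizer (covered in a later part of the exercise). A minimal sketch of that call, assuming the X, y, lambda, and initial_theta set up above; the exact options used by the exercise code may differ:
% Minimal sketch: minimize the regularized cost with fminunc
options = optimset('GradObj', 'on', 'MaxIter', 400); % tell fminunc that we supply the gradient
[theta, J, exit_flag] = fminunc(@(t)(costFunctionReg(t, X, y, lambda)), initial_theta, options);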
mapFeature.m
function out = mapFeature(X1, X2)
% MAPFEATURE Feature mapping function to polynomial features
%
%   MAPFEATURE(X1, X2) maps the two input features
%   to polynomial features used in the regularization exercise.
%
%   Returns a new feature array with more features, comprising
%   X1, X2, X1.^2, X2.^2, X1*X2, X1*X2.^2, etc..
%
%   Inputs X1, X2 must be the same size
%
degree = 6; % map the features into all polynomial terms of x1 and x2 up to the sixth power
out = ones(size(X1(:,1))); % the first column of ones handles the intercept term
for i = 1:degree
    for j = 0:i
        out(:, end+1) = (X1.^(i-j)).*(X2.^j); % append the term x1^(i-j) * x2^j
    end
end
end
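A quick sanity check on the output shape (a sketch with made-up input values, not taken from ex2data2.txt): for degree 6 the mapping yields 1 + (2+3+4+5+6+7) = 28 columns, matching the 118x28 matrix mentioned above.
X1 = [0.05; -0.09]; % hypothetical values of the first feature (two examples)
X2 = [0.70; 0.68];  % hypothetical values of the second feature
out = mapFeature(X1, X2);
size(out) % prints "ans = 2 28": two examples, 28 mapped features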
costFunctionReg.m
function [J, grad] = costFunctionReg(theta, X, y, lambda)
%COSTFUNCTIONREG Compute cost and gradient for logistic regression with regularization
% J = COSTFUNCTIONREG(theta, X, y, lambda) computes the cost of using
% theta as the parameter for regularized logistic regression and the
% gradient of the cost w.r.t. the parameters.
% Initialize some useful values
m = length(y); % number of training examples
% You need to return the following variables correctly
J = 0;
grad = zeros(size(theta));
% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta.
% You should set J to the cost.
% Compute the partial derivatives and set grad to the partial
% derivatives of the cost w.r.t. each parameter in theta
% The regularized cost function:
%   J(θ) = (1/m)·Σ[ -y·log(hθ(x)) - (1-y)·log(1-hθ(x)) ] + (λ/(2m))·Σ_{j=1..n} θj^2
J = 1/m*(-1*y'*log(sigmoid(X*theta)) - (ones(1,m)-y')*log(ones(m,1)-sigmoid(X*theta)))...
    + lambda/(2*m) * (theta(2:end,:))' * theta(2:end,:); % note that you should not regularize the parameter θ0
grad = 1/m * (X' * (sigmoid(X*theta) - y)) + (lambda/m)*theta; % regularized gradient for every parameter
grad(1) = 1/m * (X(:,1))' * (sigmoid(X*theta) - y); % recompute the first entry without the regularization term
% Note that you should not regularize the parameter θ0.
% =============================================================
end
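costFunctionReg relies on sigmoid, which the exercise provides in a separate sigmoid.m not shown in this post; a minimal sketch of that function:
function g = sigmoid(z)
% SIGMOID Compute the sigmoid function elementwise on z (scalar, vector, or matrix)
g = 1 ./ (1 + exp(-z));
end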