[OpenCV] Samples 06: [ML] logistic regression
Logistic regression by itself only handles simple, linearly separable binary classification, so it does not stand out among the many machine-learning classifiers. It can, however, be generalized to multi-class classification under another name, softmax, which is a staple classifier in deep learning.
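For reference, binary logistic regression squashes a linear score through the sigmoid, and softmax is its multi-class generalization (these are the standard textbook definitions, not something specific to the OpenCV sample below):

$$\sigma(z)=\frac{1}{1+e^{-z}},\qquad \operatorname{softmax}(z)_k=\frac{e^{z_k}}{\sum_{j=1}^{K}e^{z_j}},\quad k=1,\dots,K$$

With K = 2 classes, softmax applied to the two scores reduces to the sigmoid of their difference, which is why softmax is often described as logistic regression under a different name.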
Reference: denny的学习专栏 // a like-minded blog
- How to save images into an XML file and read them back.
- Displaying multiple images inside a single Mat.
- Using Mat::t() when displaying matrix contents.
This post uses it to classify handwritten digits.
OpenCV 3.0 ships with an XML file containing 40 samples: 20 handwritten images of the digit 0 and 20 of the digit 1. Each handwritten digit was originally a 28*28 image; in the XML it is stored as a 1*784 vector inside <data>.
The file is located at: \opencv\sources\samples\data\data01.xml
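For the first bullet above (saving images to XML and reading them back), here is a minimal sketch of how such a file could be produced and read with cv::FileStorage. The node names datamat/labelsmat match the ones read by the sample below; the file name my_digits.xml and the all-zero image are only placeholders:

#include <opencv2/core.hpp>
using namespace cv;

int main()
{
    // One fake 28x28 "digit" image; a real digit image would go here.
    Mat digit = Mat::zeros(28, 28, CV_8U);
    // Flatten to a 1x784 row, the same layout as the rows stored in data01.xml.
    Mat row = digit.reshape(0, 1);

    Mat data, labels;
    data.push_back(row);                              // one sample per row
    labels.push_back(Mat(1, 1, CV_8U, Scalar(0)));    // its label, digit "0"

    // Write both matrices to XML (file name is just an example).
    FileStorage out("my_digits.xml", FileStorage::WRITE);
    out << "datamat" << data << "labelsmat" << labels;
    out.release();

    // Read them back the same way the sample below does.
    Mat data2, labels2;
    FileStorage in("my_digits.xml", FileStorage::READ);
    in["datamat"] >> data2;
    in["labelsmat"] >> labels2;
    in.release();
    return 0;
}

The full OpenCV sample follows.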

/*//////////////////////////////////////////////////////////////////////////////////////
// IMPORTANT: READ BEFORE DOWNLOADING, COPYING, INSTALLING OR USING.
//
// By downloading, copying, installing or using the software you agree to this license.
// If you do not agree to this license, do not download, install,
// copy or use the software.
//
// This is a implementation of the Logistic Regression algorithm in C++ in OpenCV.
//
// AUTHOR:
// Rahul Kavi rahulkavi[at]live[at]com
//
// contains a subset of data from the popular Iris Dataset (taken from
// "http://archive.ics.uci.edu/ml/datasets/Iris")
//
// # You are free to use, change, or redistribute the code in any way you wish for
// # non-commercial purposes, but please maintain the name of the original author.
// # This code comes with no warranty of any kind.
//
// # Logistic Regression ALGORITHM
//
// License Agreement
// For Open Source Computer Vision Library
//
// Copyright (C) 2000-2008, Intel Corporation, all rights reserved.
// Copyright (C) 2008-2011, Willow Garage Inc., all rights reserved.
// Third party copyrights are property of their respective owners.
//
// Redistribution and use in source and binary forms, with or without modification,
// are permitted provided that the following conditions are met:
//
// * Redistributions of source code must retain the above copyright notice,
// this list of conditions and the following disclaimer.
//
// * Redistributions in binary form must reproduce the above copyright notice,
// this list of conditions and the following disclaimer in the documentation
// and/or other materials provided with the distribution.
//
// * The name of the copyright holders may not be used to endorse or promote products
// derived from this software without specific prior written permission.
//
// This software is provided by the copyright holders and contributors "as is" and
// any express or implied warranties, including, but not limited to, the implied
// warranties of merchantability and fitness for a particular purpose are disclaimed.
// In no event shall the Intel Corporation or contributors be liable for any direct,
// indirect, incidental, special, exemplary, or consequential damages
// (including, but not limited to, procurement of substitute goods or services;
// loss of use, data, or profits; or business interruption) however caused
// and on any theory of liability, whether in contract, strict liability,
// or tort (including negligence or otherwise) arising in any way out of
// the use of this software, even if advised of the possibility of such damage.*/

#include <iostream>

#include <opencv2/core.hpp>
#include <opencv2/ml.hpp>
#include <opencv2/highgui.hpp>

using namespace std;
using namespace cv;
using namespace cv::ml;

/*
 * Jeff --> Show multiple photos from one Mat.
 */
static void showImage(const Mat &data, int columns, const String &name)
{
// columns = 28
Mat bigImage;
for(int i = 0; i < data.rows; ++i)
{
// rows: number of photos.
// vector --> reshape --> col 28, col 28 ...
// push_back: stack each picture so they show left to right after the transpose.
bigImage.push_back(data.row(i).reshape(0, columns));
}
imshow(name, bigImage.t());
}

static float calculateAccuracyPercent(const Mat &original, const Mat &predicted)
{
// original == predicted yields a mask (255 where the labels match, 0 elsewhere),
// so counting its non-zero entries gives the number of correct predictions.
return 100 * (float)countNonZero(original == predicted) / predicted.rows;
}

int main()
{
const String filename = "../data/data01.xml";
cout << "**********************************************************************" << endl;
cout << filename
<< " contains digits 0 and 1 of 20 samples each, collected on an Android device" << endl;
cout << "Each of the collected images are of size 28 x 28 re-arranged to 1 x 784 matrix"
<< endl;
cout << "**********************************************************************" << endl; Mat data, labels;
{
/*
* Jeff --> Load xml.
* transform to Mat.
* FileStorage.
*/
cout << "loading the dataset...";
// Step 1.
FileStorage f;
if(f.open(filename, FileStorage::READ))
{
// Step 2.
f["datamat"] >> data;
f["labelsmat"] >> labels;
f.release();
}
else
{
cerr << "file can not be opened: " << filename << endl;
return 1;
}
// Step 3.
data.convertTo(data, CV_32F);
labels.convertTo(labels, CV_32F);

cout << "read " << data.rows << " rows of data" << endl;
}

Mat data_train, data_test;
Mat labels_train, labels_test;
for(int i = 0; i < data.rows; i++)
{
// Step 4.
if(i % 2 == 0)
{
data_train.push_back(data.row(i));
labels_train.push_back(labels.row(i));
}
else
{
data_test.push_back(data.row(i));
labels_test.push_back(labels.row(i));
}
}
cout << "training/testing samples count: " << data_train.rows << "/" << data_test.rows << endl; // display sample image
showImage(data_train, 28, "train data");
showImage(data_test, 28, "test data"); /**************************************************************************/ // simple case with batch gradient
cout << "training..."; // Step (1), create classifier.
Ptr<LogisticRegression> lr1 = LogisticRegression::create(); // Step (2),
lr1->setLearningRate(0.001);
lr1->setIterations(10);
lr1->setRegularization(LogisticRegression::REG_L2);
lr1->setTrainMethod(LogisticRegression::BATCH);
lr1->setMiniBatchSize(1);

// Step (3), train.
//! [init]
lr1->train(data_train, ROW_SAMPLE, labels_train);
cout << "done!" << endl; //-------------------------------------------------------------------------- cout << "predicting..."; // Step (4), predict.
Mat responses;
lr1->predict(data_test, responses);
cout << "done!" << endl; // Step (5), show prediction report
cout << "original vs predicted:" << endl;
// Jeff --> CV_32S is a signed 32bit integer value for each pixel.
labels_test.convertTo(labels_test, CV_32S);

cout << labels_test.t() << endl;
cout << responses.t() << endl;
cout << "accuracy: " << calculateAccuracyPercent(labels_test, responses) << "%" << endl; // Step (6), save the classfier
const String saveFilename = "NewLR_Trained.xml";
cout << "saving the classifier to " << saveFilename << endl;
lr1->save(saveFilename);

/****************************** End ***************************************/

// load the classifier onto new object
cout << "loading a new classifier from " << saveFilename << endl;
Ptr<LogisticRegression> lr2 = StatModel::load<LogisticRegression>(saveFilename);

// predict using loaded classifier
cout << "predicting the dataset using the loaded classifier...";
Mat responses2;
lr2->predict(data_test, responses2);
cout << "done!" << endl; // calculate accuracy
cout << labels_test.t() << endl;
cout << responses2.t() << endl;
cout << "accuracy: " << calculateAccuracyPercent(labels_test, responses2) << "%" << endl; waitKey(0);
return 0;
}
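The sample above trains with full-batch gradient descent (LogisticRegression::BATCH). The same class also supports mini-batch updates; here is a minimal sketch of that variant, reusing data_train, labels_train, data_test and labels_test from the code above (the batch size and the other parameter values are illustrative, not tuned):

// Mini-batch variant of steps (1)-(4) above (sketch only).
Ptr<LogisticRegression> lr_mb = LogisticRegression::create();
lr_mb->setLearningRate(0.001);
lr_mb->setIterations(10);
lr_mb->setRegularization(LogisticRegression::REG_L2);
lr_mb->setTrainMethod(LogisticRegression::MINI_BATCH);
lr_mb->setMiniBatchSize(10);          // update the parameters after every 10 samples
lr_mb->train(data_train, ROW_SAMPLE, labels_train);

Mat responses_mb;
lr_mb->predict(data_test, responses_mb);
cout << "accuracy (mini-batch): "
     << calculateAccuracyPercent(labels_test, responses_mb) << "%" << endl;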
About logistic regression: http://blog.csdn.net/pakko/article/details/37878837
What is logistic regression?
Logistic regression actually has much in common with multiple linear regression; the biggest difference lies in their dependent variables, and almost everything else is the same. Because of that, the two belong to the same family, the generalized linear models (GLM).
The models in this family all take essentially the same form; what differs is the dependent variable:
- if it is continuous, it is multiple linear regression;
- if it follows a binomial distribution, it is logistic regression (see the logit-link sketch after this list);
- if it follows a Poisson distribution, it is Poisson regression;
- if it follows a negative binomial distribution, it is negative binomial regression.
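For the binomial case the link function is the logit, which is exactly what turns a linear model into logistic regression (standard GLM notation, with p the probability of the positive class):

$$\log\frac{p}{1-p}=\theta^{T}x \quad\Longleftrightarrow\quad p=\frac{1}{1+e^{-\theta^{T}x}}$$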
The dependent variable of logistic regression can be binary or multi-class, but the binary case is more common and easier to interpret, so in practice binary logistic regression is used most often.
The main uses of logistic regression:
- Finding risk factors: identifying the risk factors of a given disease, etc.;
- Prediction: given the model, predicting how likely a disease or condition is to occur under different values of the independent variables;
- Discrimination: much like prediction, using the model to judge how likely it is that a given person belongs to a certain disease or condition group.
Logistic regression is applied mostly in epidemiology. A typical scenario is to explore the risk factors of a disease and predict its probability from those factors. For example, to study the risk factors of gastric cancer, one can select two groups of people, a gastric cancer group and a non-cancer group, which will certainly differ in physical signs, lifestyle and so on. Here the dependent variable is whether the person has gastric cancer, i.e. "yes" or "no", and the independent variables can include many things, such as age, sex, dietary habits, Helicobacter pylori infection, etc. The independent variables can be either continuous or categorical.
Typical steps
The usual steps of a regression problem are:
- find the hypothesis function h; ==> the sigmoid function
- construct the cost function J (the loss function);
- find a way to minimize J and solve for the regression parameters (θ).
The standard form of these three pieces is sketched below; see the reference blog for the full derivation.
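In their textbook form (OpenCV's implementation follows the same scheme up to implementation details), the hypothesis, the cost over m training samples, and the gradient-descent update are:

$$h_\theta(x)=\frac{1}{1+e^{-\theta^{T}x}}$$

$$J(\theta)=-\frac{1}{m}\sum_{i=1}^{m}\Big[y^{(i)}\log h_\theta(x^{(i)})+\big(1-y^{(i)}\big)\log\big(1-h_\theta(x^{(i)})\big)\Big]$$

$$\theta_j := \theta_j-\alpha\frac{\partial J(\theta)}{\partial\theta_j}=\theta_j-\frac{\alpha}{m}\sum_{i=1}^{m}\big(h_\theta(x^{(i)})-y^{(i)}\big)x_j^{(i)}$$

Here α is the learning rate, the value set via setLearningRate() in the sample above; enabling L2 regularization with setRegularization(LogisticRegression::REG_L2) adds a penalty term proportional to $\sum_j\theta_j^2$ to J.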