Using Logistic Regression to Predict Whether a Horse Is Sick
1. The main idea of classification with logistic regression
Fit a regression formula for the classification decision boundary to the available data, that is, find the best-fit set of parameters, and then use that boundary to classify new examples.
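In symbols (the standard formulation, written out here for reference): the classifier scores an example x with a weight vector w through the sigmoid and thresholds the result at 0.5,

$$\sigma(z)=\frac{1}{1+e^{-z}},\qquad \hat{y}=\begin{cases}1, & \sigma(w^{\top}x)>0.5\\ 0, & \text{otherwise.}\end{cases}$$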
2. Finding the best-fit parameters with gradient descent
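For a single training example (x, y) with y in {0, 1}, the gradient of the logistic loss with respect to the weights is (sigma(w·x) − y)·x, so each stochastic step with learning rate alpha is

$$w \leftarrow w - \alpha\,\bigl(\sigma(w^{\top}x)-y\bigr)\,x,$$

which is exactly the update the code below performs (weights = weights - alpha*gradient).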
3. Code implementation
# -*- coding: utf-8 -*-
"""
Created on Tue Mar 28 21:35:25 2017
@author: MyHome
"""
import numpy as np
from random import uniform
'''Sigmoid function'''
def sigmoid(inX):
    return 1.0 / (1.0 + np.exp(-inX))

'''Update the weights with stochastic gradient descent and return the final weights'''
def StocGradientDescent(dataMatrix, classLabels, numIter=600):
    m, n = dataMatrix.shape
    weights = np.ones(n)
    for j in xrange(numIter):
        dataIndex = range(m)                         # samples not yet visited in this pass
        for i in xrange(m):
            alpha = 4 / (1.0 + j + i) + 0.01         # step size shrinks as training proceeds
            randIndex = int(uniform(0, len(dataIndex)))
            sampleIndex = dataIndex[randIndex]       # pick a random remaining sample
            h = sigmoid(sum(dataMatrix[sampleIndex] * weights))
            gradient = (h - classLabels[sampleIndex]) * dataMatrix[sampleIndex]
            weights = weights - alpha * gradient
            del(dataIndex[randIndex])                # sample without replacement within a pass
    return weights

'''Classifier: threshold the sigmoid output at 0.5'''
def classifyVector(inX, weights):
    prob = sigmoid(sum(inX * weights))
    if prob > 0.5:
        return 1.0
    else:
        return 0.0

'''Train on the training file, then report the error rate on the test file'''
def Test():
    frTrain = open("horseColicTraining.txt")
    frTest = open("horseColicTest.txt")
    trainingSet = []
    trainingLabel = []
    for line in frTrain.readlines():
        currLine = line.strip().split("\t")
        lineArr = []
        for i in range(21):                          # 21 features per record, label in column 21
            lineArr.append(float(currLine[i]))
        trainingSet.append(lineArr)
        trainingLabel.append(float(currLine[21]))
    trainWeights = StocGradientDescent(np.array(trainingSet), trainingLabel)
    errorCount = 0.0
    numTestVec = 0.0
    for line in frTest.readlines():
        numTestVec += 1.0
        currLine = line.strip().split("\t")
        lineArr = []
        for i in range(21):
            lineArr.append(float(currLine[i]))
        if int(classifyVector(np.array(lineArr), trainWeights)) != int(currLine[21]):
            errorCount += 1
    errorRate = float(errorCount) / numTestVec
    print "the error rate of this test is:%f" % errorRate
    return errorRate

'''Call Test() 10 times and average the error rate'''
def multiTest():
    numTest = 10
    errorSum = 0.0
    for k in range(numTest):
        errorSum += Test()
    print "after %d iterations the average error rate is: %f" % (numTest, errorSum / float(numTest))

if __name__ == "__main__":
    multiTest()
Results:
the error rate of this test is:0.522388
the error rate of this test is:0.328358
the error rate of this test is:0.313433
the error rate of this test is:0.358209
the error rate of this test is:0.298507
the error rate of this test is:0.343284
the error rate of this test is:0.283582
the error rate of this test is:0.313433
the error rate of this test is:0.343284
the error rate of this test is:0.358209
after 10 iterations the average error rate is: 0.346269
4. Summary
Logistic regression is finding best-fit parameters to a nonlinear function called the sigmoid.
Methods of optimization can be used to find the best-fit parameters. Among the optimization algorithms, one of the most common is gradient descent. Gradient descent can be simplified with stochastic gradient descent.
Stochastic gradient descent can do as well as gradient descent using far fewer computing
resources. In addition, stochastic gradient descent is an online algorithm; it can
update what it has learned as new data comes in rather than reloading all of the data
as in batch processing.
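To make the online point concrete, here is a minimal sketch (not part of the original program; online_update and new_records are hypothetical names) of how the weights returned by StocGradientDescent could be refined one record at a time, reusing sigmoid and np from the listing above:

def online_update(weights, x, y, alpha=0.01):
    '''Apply one stochastic-gradient step for a single new sample (x, y).'''
    h = sigmoid(np.dot(x, weights))         # prediction for the new record
    return weights - alpha * (h - y) * x    # same update rule used during training

# Hypothetical usage: new_records is a stream of (21-feature array, label) pairs.
# for x, y in new_records:
#     trainWeights = online_update(trainWeights, x, y)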
One major problem in machine learning is how to deal with missing values in the
data. There’s no blanket answer to this question. It really depends on what you’re
doing with the data. There are a number of solutions, and each solution has its own
advantages and disadvantages.
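As one concrete option for this particular model (a sketch of a common choice, not necessarily how the horse-colic files above were prepared): replacing a missing feature value with 0 is convenient for logistic regression, because a zero feature adds nothing to w·x and, since the per-sample update is alpha*(h - y)*x, it also leaves the corresponding weight unchanged. A hypothetical parse_line helper, assuming missing entries are marked with "?":

def parse_line(line, num_features=21, missing_marker="?"):
    '''Split one tab-separated record, filling missing feature values with 0.0.'''
    fields = line.strip().split("\t")
    features = []
    for raw in fields[:num_features]:
        if raw == missing_marker or raw == "":
            features.append(0.0)            # 0 leaves the corresponding weight untouched
        else:
            features.append(float(raw))
    label = float(fields[num_features])
    return features, label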