1. The main idea of classification with logistic regression

Fit a regression formula to the class decision boundary based on the existing data, i.e. find the best-fit set of parameters, and then use that formula to classify new inputs.
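Concretely (a sketch of the model, with w denoting the weight vector and x a feature vector): the regression formula is z = w0*x0 + w1*x1 + ... + wn*xn, the value z is passed through the sigmoid function sigma(z) = 1 / (1 + e^(-z)), and the input is assigned to class 1 when sigma(z) > 0.5 (equivalently, when z > 0) and to class 0 otherwise.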

2. Finding the best-fit parameters with gradient descent
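For contrast with the stochastic version used in the code below, here is a minimal sketch of plain (full-batch) gradient descent for logistic regression; the name gradientDescent and the default step size alpha and iteration count are illustrative choices, not part of the original program.

import numpy as np

def sigmoid(inX):
    return 1.0 / (1.0 + np.exp(-inX))

def gradientDescent(dataMatrix, classLabels, alpha=0.001, numIter=500):
    # dataMatrix: m x n array of samples; classLabels: length-m array of 0/1 labels.
    m, n = dataMatrix.shape
    labels = np.array(classLabels, dtype=float)
    weights = np.ones(n)
    for _ in range(numIter):
        h = sigmoid(dataMatrix.dot(weights))                  # predictions for all m samples
        error = h - labels                                    # how far each prediction is off
        weights = weights - alpha * dataMatrix.T.dot(error)   # one full-batch gradient step
    return weights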

3. Code implementation

# -*- coding: utf-8 -*-
"""
Created on Tue Mar 28 21:35:25 2017 @author: MyHome
"""
import numpy as np
from random import uniform


def sigmoid(inX):
    '''Sigmoid function.'''
    return 1.0 / (1.0 + np.exp(-inX))


def StocGradientDescent(dataMatrix, classLabels, numIter=600):
    '''Update the weights with stochastic gradient descent and return the final values.'''
    m, n = dataMatrix.shape
    weights = np.ones(n)
    for j in range(numIter):
        dataIndex = list(range(m))                        # sample indices not yet used in this pass
        for i in range(m):
            alpha = 4 / (1.0 + j + i) + 0.01              # step size shrinks as training goes on
            randIndex = int(uniform(0, len(dataIndex)))   # pick a random remaining sample
            sample = dataIndex[randIndex]
            h = sigmoid(sum(dataMatrix[sample] * weights))
            gradient = (h - classLabels[sample]) * dataMatrix[sample]
            weights = weights - alpha * gradient
            del dataIndex[randIndex]                      # do not reuse this sample within the pass
    return weights


def classifyVector(inX, weights):
    '''Classifier: predict 1.0 if the sigmoid output exceeds 0.5, else 0.0.'''
    prob = sigmoid(sum(inX * weights))
    if prob > 0.5:
        return 1.0
    else:
        return 0.0


def Test():
    '''Train on the horse colic training set and report the error rate on the test set.'''
    frTrain = open("horseColicTraining.txt")
    frTest = open("horseColicTest.txt")
    trainingSet = []
    trainingLabel = []
    for line in frTrain.readlines():
        currLine = line.strip().split("\t")
        lineArr = []
        for i in range(21):
            lineArr.append(float(currLine[i]))
        trainingSet.append(lineArr)
        trainingLabel.append(float(currLine[21]))
    trainWeights = StocGradientDescent(np.array(trainingSet), trainingLabel)
    errorCount = 0.0
    numTestVec = 0.0
    for line in frTest.readlines():
        numTestVec += 1.0
        currLine = line.strip().split("\t")
        lineArr = []
        for i in range(21):
            lineArr.append(float(currLine[i]))
        if int(classifyVector(np.array(lineArr), trainWeights)) != int(currLine[21]):
            errorCount += 1
    errorRate = float(errorCount) / numTestVec
    print("the error rate of this test is:%f" % errorRate)
    return errorRate


def multiTest():
    '''Call Test() ten times and report the average error rate.'''
    numTest = 10
    errorSum = 0.0
    for k in range(numTest):
        errorSum += Test()
    print("after %d iterations the average error rate is:%f" % (numTest, errorSum / float(numTest)))


if __name__ == "__main__":
    multiTest()

Results:

the error rate of this test is:0.522388

the error rate of this test is:0.328358

the error rate of this test is:0.313433

the error rate of this test is:0.358209

the error rate of this test is:0.298507

the error rate of this test is:0.343284

the error rate of this test is:0.283582

the error rate of this test is:0.313433

the error rate of this test is:0.343284

the error rate of this test is:0.358209

after 10 iterations the average error rate is:0.346269

4. Summary

Logistic regression is finding best-fit parameters to a nonlinear function called the sigmoid. Methods of optimization can be used to find the best-fit parameters. Among the optimization algorithms, one of the most common is gradient descent. Gradient descent can be simplified with stochastic gradient descent.

Stochastic gradient descent can do as well as gradient descent using far fewer computing resources. In addition, stochastic gradient descent is an online algorithm; it can update what it has learned as new data comes in rather than reloading all of the data as in batch processing.
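A minimal sketch of that online behavior (using the same sigmoid and weight conventions as the code above; the name onlineUpdate is illustrative): a single new labelled sample is folded into the existing weights with one stochastic gradient step, without revisiting previously seen data.

import numpy as np

def onlineUpdate(weights, x, label, alpha=0.01):
    # Fold one new sample (feature vector x, label 0 or 1) into the current weights.
    h = 1.0 / (1.0 + np.exp(-np.dot(x, weights)))    # prediction for the new sample
    return weights - alpha * (h - label) * x         # one stochastic gradient step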

One major problem in machine learning is how to deal with missing values in the data. There's no blanket answer to this question. It really depends on what you're doing with the data. There are a number of solutions, and each solution has its own advantages and disadvantages.
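One common choice that fits this update rule (a sketch, assuming missing entries have been read in as NaN; fillMissingWithZero is an illustrative helper, not part of the code above) is to replace a missing numeric feature with 0: since each weight is adjusted by alpha * (h - label) * x, a zero feature leaves both the prediction and that weight untouched, so the missing value neither helps nor hurts that coefficient.

import numpy as np

def fillMissingWithZero(dataMatrix):
    # Replace NaN entries with 0 so a missing feature contributes nothing
    # to the sigmoid input or to the weight update for that coefficient.
    data = np.array(dataMatrix, dtype=float)
    data[np.isnan(data)] = 0.0
    return data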
