One-layer Softmax Classifier, "Handwriting Recognition"
import the libraries needed¶
from PIL import Image
import numpy as np
import matplotlib.pyplot as plt
import re
from glob import glob
begin: load the data¶
def load_data(train_path='train/', test_path='test/'):
    # Collect the training images and parse the digit label from each file name.
    train_list = glob(train_path + '*.png')
    pattern = re.compile(r'num(\d).png')
    train_id = np.array([float(pattern.search(img_name).groups()[0]) for img_name in train_list])
    train_data = np.concatenate([np.array(Image.open(img_name)).reshape(1, 784) for img_name in train_list], axis=0).astype(np.float64)
    # Same for the test set.
    test_list = glob(test_path + '*.png')
    test_id = np.array([float(pattern.search(img_name).groups()[0]) for img_name in test_list])
    test_data = np.concatenate([np.array(Image.open(img_name)).reshape(1, 784) for img_name in test_list], axis=0).astype(np.float64)
    return train_id, train_data, test_id, test_data
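As an aside on the label parsing: the exact file names are not shown in this post, but any name ending in num<digit>.png matches the pattern above. A small illustration with a made-up file name (the path is hypothetical):
pattern = re.compile(r'num(\d).png')
example_name = 'train/00042_num7.png'   # hypothetical file name, for illustration only
label = float(pattern.search(example_name).groups()[0])
print(label)   # 7.0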
load the data and print its shape¶
train_id,train_data,test_id,test_data=load_data()
train_id.shape,train_data.shape,test_id.shape,test_data.shape
((60000,), (60000, 784), (10000,), (10000, 784))
train_val = np.zeros((train_id.shape[0], 10))
for i in range(train_id.shape[0]):
    train_val[i, train_id[i].astype('int')] = 1
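The loop above works; as a side note, the same one-hot encoding can be written in a single vectorized step with np.eye, assuming the labels are the digits 0-9:
# Equivalent vectorized one-hot encoding (labels assumed to be 0..9).
train_val = np.eye(10)[train_id.astype(int)]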
split the data into mini-batches¶
mini_batch_num=100
mini_batch_size=600
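A quick, optional sanity check that 100 mini-batches of 600 samples cover the whole training set of 60,000 images:
# 100 * 600 = 60000, i.e. every training image belongs to exactly one mini-batch.
assert mini_batch_num * mini_batch_size == train_data.shape[0]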
define the functions needed: softmax, propagation, and back-propagation¶
def softmax(x):
    x = x - np.max(x)           # safe shift: softmax(x) = softmax(x + c)
    exp_x = np.exp(x)
    softmax_x = exp_x / np.sum(exp_x)
    return softmax_x
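The x - np.max(x) shift relies on the identity softmax(x) = softmax(x + c) for any constant c; subtracting the maximum keeps np.exp from overflowing. A quick numerical check with arbitrary values:
x = np.array([1.0, 2.0, 3.0])
c = 100.0
# Shifting every score by the same constant leaves the probabilities unchanged.
print(np.allclose(softmax(x), softmax(x + c)))   # True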
use cross-entropy to compute the loss; this is part of propagation¶
def propa(train_x, train_y, W, b):  # forward propagation for one sample
    yt = softmax(np.dot(train_x, W) + b)
    loss = -np.sum(train_y.T.dot(np.log(yt)))  # cross-entropy
    dy = (yt - train_y).T
    return dy, loss
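Note that np.log(yt) is what produces the "divide by zero encountered in log" RuntimeWarning seen in the training output further down: it fires whenever a predicted probability underflows to exactly 0. One common remedy, not used in the original code, is to pad the log argument with a small epsilon; a hedged variant (propa_stable is a made-up name):
def propa_stable(train_x, train_y, W, b, eps=1e-12):
    # Same forward pass as propa(), but the log argument is padded with eps
    # so that a probability that underflows to 0 does not yield -inf.
    yt = softmax(np.dot(train_x, W) + b)
    loss = -np.sum(train_y * np.log(yt + eps))   # cross-entropy with epsilon
    dy = (yt - train_y).T
    return dy, loss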
update W¶
def back_propa(train_data, train_id, W, b, alpha, data_size):
    # Per-sample stochastic gradient descent over one mini-batch.
    for i in range(data_size):
        dy, loss = propa(train_data[i, :], train_id[i, :], W, b)
        dy = dy.reshape(1, 10)
        p = train_data[i, :]
        p = p.reshape(784, 1)
        dW = alpha * np.dot(p, dy)
        W -= dW
    return W, loss
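One thing to notice: back_propa only updates W, so the bias never moves from its initial value. If one also wanted to learn the bias, its gradient for this loss is the same (yt - y) term. The rough sketch below assumes b is a length-10 vector (e.g. np.zeros(10)) rather than the scalar used in the next cell, and back_propa_with_bias is just an illustrative name:
def back_propa_with_bias(train_data, train_id, W, b, alpha, data_size):
    # Same per-sample SGD as back_propa, but b is updated too: dL/db = yt - y = dy.
    for i in range(data_size):
        dy, loss = propa(train_data[i, :], train_id[i, :], W, b)
        dy = dy.reshape(1, 10)
        p = train_data[i, :].reshape(784, 1)
        W -= alpha * np.dot(p, dy)
        b -= alpha * dy.reshape(10)
    return W, b, loss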
initialize W and b¶
W=np.zeros((784,10))
b=1
loop and update, and print the accuracy on the training dataset¶
for i in range(mini_batch_num):
    # 20 passes of SGD over mini-batch i.
    for iteration in range(20):
        lb = mini_batch_size * i
        ub = mini_batch_size * (i + 1)
        mini_batch_data = train_data[lb:ub, :]
        mini_batch_id = train_val[lb:ub, :]
        W, loss = back_propa(mini_batch_data, mini_batch_id, W, b, 0.01, 600)
    # Accuracy on the first 600 training samples.
    count = 0
    for j in range(600):
        if np.argmax(softmax(train_data[j, :].dot(W))) == train_id[j].astype('int'):
            count += 1
    acc = count / 600
    if i % 10 == 0:
        print('batch={},acc={}'.format(i + 1, acc))
e:\Anaconda3\lib\site-packages\ipykernel_launcher.py:3: RuntimeWarning: divide by zero encountered in log
This is separate from the ipykernel package so we can avoid doing imports until
batch=1,acc=1.0
batch=11,acc=0.8833333333333333
batch=21,acc=0.865
batch=31,acc=0.8983333333333333
batch=41,acc=0.8766666666666667
batch=51,acc=0.8883333333333333
batch=61,acc=0.8733333333333333
batch=71,acc=0.845
batch=81,acc=0.89
batch=91,acc=0.8766666666666667
predict on the test dataset¶
count = 0
for j in range(test_id.shape[0]):
    if np.argmax(softmax(test_data[j, :].dot(W))) == test_id[j].astype('int'):
        count += 1
acc = count / test_id.shape[0]
print(acc)
0.9103
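With the trained W (and the unchanged bias b), predicting a single image is just an argmax over the 10 class scores; softmax is monotonic, so it does not change which class wins. A minimal example using the first test image (index 0 chosen arbitrarily):
scores = test_data[0, :].dot(W) + b          # 10 class scores for one image
pred = np.argmax(scores)                     # argmax(scores) == argmax(softmax(scores))
print('predicted:', pred, 'true:', int(test_id[0]))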