[caffe] Deep Learning for Image Classification: Understanding the VGG Model
1. Brief Introduction
VGG and GoogLeNet were the two standout models of the 2014 ImageNet competition, and the two architectures share one trait: go deeper. Unlike GoogLeNet, VGG inherits the framework of LeNet and AlexNet, and resembles AlexNet in particular: five groups of convolutions, two FC layers for image features, and one FC layer for classification, so it can be viewed as eight parts in total, just like AlexNet. By varying the configuration of each of the first five convolutional groups, the VGG paper defines five configurations, A through E, whose convolutional layer count grows from 8 to 16.
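For reference, the per-group convolutional layer counts of the five configurations are as follows (reconstructed from memory of Table 1 in the VGG paper; A-LRN, which only adds local response normalization to A, is left out):

config   group1  group2  group3  group4  group5  total conv
A        1       1       2       2       2       8
B        2       2       2       2       2       10
C        2       2       3       3       3       13
D        2       2       3       3       3       13
E        2       2       4       4       4       16

C and D differ in kernel size rather than count: the third convolution in C's last three groups is 1x1, while D uses 3x3 throughout. The shape log in section 3 below corresponds to configuration A.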
The paper suggests that as the network deepens step by step from 8 to 16 convolutional layers, accuracy gains from depth alone appear to be hitting a bottleneck. Later papers went on to study pre-processing of convolutional inputs (e.g., batch normalization) and post-processing of outputs (e.g., PReLU). What the next direction for improvement will be is worth thinking through carefully.
2. Network Analysis
Starting from the configuration file at http://cs.stanford.edu/people/karpathy/vgg_train_val.prototxt and following the guidance of the VGG paper, I modified it to obtain the VGG-A network structure.
In the course of the modification you will find that, in order to compare networks of different depths without changing too much between configurations, VGG gives all convolutional layers and all pooling layers identical layer parameters. This ensures that the output shape of every group is the same no matter how many convolutional layers you stack inside a group.
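As a concrete illustration, here is a minimal sketch of what the first group looks like under those uniform parameters, written in the old-style prototxt syntax of Karpathy's file (every convolution: 3x3 kernel, pad 1, stride 1; every pooling: 2x2 max, stride 2, per the VGG paper). This is my reconstruction, not a quote from the original file, and weight fillers and learning-rate fields are omitted:

layers {
  bottom: "data"
  top: "conv1_1"
  name: "conv1_1"
  type: CONVOLUTION
  convolution_param {
    num_output: 64   # channels per group: 64, 128, 256, 512, 512
    kernel_size: 3   # 3x3 with pad 1 and stride 1 keeps H and W unchanged
    pad: 1
    stride: 1
  }
}
layers {
  bottom: "conv1_1"
  top: "pool1"
  name: "pool1"
  type: POOLING
  pooling_param {
    pool: MAX
    kernel_size: 2   # 2x2 max pooling with stride 2 halves H and W
    stride: 2
  }
}

Because only the pooling layers change the spatial size, inserting extra convolutions into a group (as configurations B through E do) leaves the group's output shape untouched.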
3. Network Log
Below is the detailed shape log. Each "Top shape" line gives batch x channels x height x width, with the blob's total element count in parentheses:
I0701 17:01:10.548092 26739 data_layer.cpp:85] output data size: 100,3,224,224
I0701 17:01:10.736845 26739 net.cpp:206] Top shape: 100 3 224 224 (15052800)
I0701 17:01:10.736912 26739 net.cpp:206] Top shape: 100 (100)
I0701 17:01:10.736929 26739 layer_factory.hpp:75] Creating layer conv1_1
I0701 17:01:10.736968 26739 net.cpp:166] Creating Layer conv1_1
I0701 17:01:10.736979 26739 net.cpp:496] conv1_1 <- data
I0701 17:01:10.737004 26739 net.cpp:452] conv1_1 -> conv1_1
I0701 17:01:10.737030 26739 net.cpp:197] Setting up conv1_1
I0701 17:01:10.738733 26739 net.cpp:206] Top shape: 100 64 224 224 (321126400)
I0701 17:01:10.738770 26739 layer_factory.hpp:75] Creating layer relu1_1
I0701 17:01:10.738786 26739 net.cpp:166] Creating Layer relu1_1
I0701 17:01:10.738824 26739 net.cpp:496] relu1_1 <- conv1_1
I0701 17:01:10.738838 26739 net.cpp:439] relu1_1 -> conv1_1 (in-place)
I0701 17:01:10.738853 26739 net.cpp:197] Setting up relu1_1
I0701 17:01:10.738867 26739 net.cpp:206] Top shape: 100 64 224 224 (321126400)
I0701 17:01:10.738878 26739 layer_factory.hpp:75] Creating layer pool1
I0701 17:01:10.738890 26739 net.cpp:166] Creating Layer pool1
I0701 17:01:10.738900 26739 net.cpp:496] pool1 <- conv1_1
I0701 17:01:10.738914 26739 net.cpp:452] pool1 -> pool1
I0701 17:01:10.738930 26739 net.cpp:197] Setting up pool1
I0701 17:01:10.738963 26739 net.cpp:206] Top shape: 100 64 112 112 (80281600)
I0701 17:01:10.738975 26739 layer_factory.hpp:75] Creating layer conv2_1
I0701 17:01:10.738992 26739 net.cpp:166] Creating Layer conv2_1
I0701 17:01:10.739001 26739 net.cpp:496] conv2_1 <- pool1
I0701 17:01:10.739017 26739 net.cpp:452] conv2_1 -> conv2_1
I0701 17:01:10.739030 26739 net.cpp:197] Setting up conv2_1
I0701 17:01:10.746640 26739 net.cpp:206] Top shape: 100 128 112 112 (160563200)
I0701 17:01:10.746669 26739 layer_factory.hpp:75] Creating layer relu2_1
I0701 17:01:10.746682 26739 net.cpp:166] Creating Layer relu2_1
I0701 17:01:10.746691 26739 net.cpp:496] relu2_1 <- conv2_1
I0701 17:01:10.746702 26739 net.cpp:439] relu2_1 -> conv2_1 (in-place)
I0701 17:01:10.746714 26739 net.cpp:197] Setting up relu2_1
I0701 17:01:10.746726 26739 net.cpp:206] Top shape: 100 128 112 112 (160563200)
I0701 17:01:10.746734 26739 layer_factory.hpp:75] Creating layer pool2
I0701 17:01:10.746749 26739 net.cpp:166] Creating Layer pool2
I0701 17:01:10.746759 26739 net.cpp:496] pool2 <- conv2_1
I0701 17:01:10.746770 26739 net.cpp:452] pool2 -> pool2
I0701 17:01:10.746783 26739 net.cpp:197] Setting up pool2
I0701 17:01:10.746798 26739 net.cpp:206] Top shape: 100 128 56 56 (40140800)
I0701 17:01:10.746809 26739 layer_factory.hpp:75] Creating layer conv3_1
I0701 17:01:10.746825 26739 net.cpp:166] Creating Layer conv3_1
I0701 17:01:10.746835 26739 net.cpp:496] conv3_1 <- pool2
I0701 17:01:10.746846 26739 net.cpp:452] conv3_1 -> conv3_1
I0701 17:01:10.746860 26739 net.cpp:197] Setting up conv3_1
I0701 17:01:10.747910 26739 net.cpp:206] Top shape: 100 256 56 56 (80281600)
I0701 17:01:10.747939 26739 layer_factory.hpp:75] Creating layer relu3_1
I0701 17:01:10.747954 26739 net.cpp:166] Creating Layer relu3_1
I0701 17:01:10.747963 26739 net.cpp:496] relu3_1 <- conv3_1
I0701 17:01:10.747974 26739 net.cpp:439] relu3_1 -> conv3_1 (in-place)
I0701 17:01:10.747985 26739 net.cpp:197] Setting up relu3_1
I0701 17:01:10.747997 26739 net.cpp:206] Top shape: 100 256 56 56 (80281600)
I0701 17:01:10.748009 26739 layer_factory.hpp:75] Creating layer conv3_2
I0701 17:01:10.748021 26739 net.cpp:166] Creating Layer conv3_2
I0701 17:01:10.748030 26739 net.cpp:496] conv3_2 <- conv3_1
I0701 17:01:10.748045 26739 net.cpp:452] conv3_2 -> conv3_2
I0701 17:01:10.748060 26739 net.cpp:197] Setting up conv3_2
I0701 17:01:10.750586 26739 net.cpp:206] Top shape: 100 256 56 56 (80281600)
I0701 17:01:10.750610 26739 layer_factory.hpp:75] Creating layer relu3_2
I0701 17:01:10.750624 26739 net.cpp:166] Creating Layer relu3_2
I0701 17:01:10.750635 26739 net.cpp:496] relu3_2 <- conv3_2
I0701 17:01:10.750648 26739 net.cpp:439] relu3_2 -> conv3_2 (in-place)
I0701 17:01:10.750669 26739 net.cpp:197] Setting up relu3_2
I0701 17:01:10.750681 26739 net.cpp:206] Top shape: 100 256 56 56 (80281600)
I0701 17:01:10.750690 26739 layer_factory.hpp:75] Creating layer pool3
I0701 17:01:10.750702 26739 net.cpp:166] Creating Layer pool3
I0701 17:01:10.750710 26739 net.cpp:496] pool3 <- conv3_2
I0701 17:01:10.750725 26739 net.cpp:452] pool3 -> pool3
I0701 17:01:10.750740 26739 net.cpp:197] Setting up pool3
I0701 17:01:10.750756 26739 net.cpp:206] Top shape: 100 256 28 28 (20070400)
I0701 17:01:10.750764 26739 layer_factory.hpp:75] Creating layer conv4_1
I0701 17:01:10.750779 26739 net.cpp:166] Creating Layer conv4_1
I0701 17:01:10.750788 26739 net.cpp:496] conv4_1 <- pool3
I0701 17:01:10.750800 26739 net.cpp:452] conv4_1 -> conv4_1
I0701 17:01:10.750825 26739 net.cpp:197] Setting up conv4_1
I0701 17:01:10.756436 26739 net.cpp:206] Top shape: 100 512 28 28 (40140800)
I0701 17:01:10.756474 26739 layer_factory.hpp:75] Creating layer relu4_1
I0701 17:01:10.756489 26739 net.cpp:166] Creating Layer relu4_1
I0701 17:01:10.756499 26739 net.cpp:496] relu4_1 <- conv4_1
I0701 17:01:10.756510 26739 net.cpp:439] relu4_1 -> conv4_1 (in-place)
I0701 17:01:10.756523 26739 net.cpp:197] Setting up relu4_1
I0701 17:01:10.756536 26739 net.cpp:206] Top shape: 100 512 28 28 (40140800)
I0701 17:01:10.756546 26739 layer_factory.hpp:75] Creating layer conv4_2
I0701 17:01:10.756559 26739 net.cpp:166] Creating Layer conv4_2
I0701 17:01:10.756568 26739 net.cpp:496] conv4_2 <- conv4_1
I0701 17:01:10.756583 26739 net.cpp:452] conv4_2 -> conv4_2
I0701 17:01:10.756597 26739 net.cpp:197] Setting up conv4_2
I0701 17:01:10.766434 26739 net.cpp:206] Top shape: 100 512 28 28 (40140800)
I0701 17:01:10.766474 26739 layer_factory.hpp:75] Creating layer relu4_2
I0701 17:01:10.766490 26739 net.cpp:166] Creating Layer relu4_2
I0701 17:01:10.766500 26739 net.cpp:496] relu4_2 <- conv4_2
I0701 17:01:10.766513 26739 net.cpp:439] relu4_2 -> conv4_2 (in-place)
I0701 17:01:10.766531 26739 net.cpp:197] Setting up relu4_2
I0701 17:01:10.766543 26739 net.cpp:206] Top shape: 100 512 28 28 (40140800)
I0701 17:01:10.766552 26739 layer_factory.hpp:75] Creating layer pool4
I0701 17:01:10.766573 26739 net.cpp:166] Creating Layer pool4
I0701 17:01:10.766582 26739 net.cpp:496] pool4 <- conv4_2
I0701 17:01:10.766595 26739 net.cpp:452] pool4 -> pool4
I0701 17:01:10.766608 26739 net.cpp:197] Setting up pool4
I0701 17:01:10.766624 26739 net.cpp:206] Top shape: 100 512 14 14 (10035200)
I0701 17:18:56.158187 29940 layer_factory.hpp:75] Creating layer conv5_1
I0701 17:18:56.158201 29940 net.cpp:166] Creating Layer conv5_1
I0701 17:18:56.158210 29940 net.cpp:496] conv5_1 <- pool4
I0701 17:18:56.158222 29940 net.cpp:452] conv5_1 -> conv5_1
I0701 17:18:56.158234 29940 net.cpp:197] Setting up conv5_1
I0701 17:18:56.168265 29940 net.cpp:206] Top shape: 100 512 14 14 (10035200)
I0701 17:18:56.168303 29940 layer_factory.hpp:75] Creating layer relu5_1
I0701 17:18:56.168320 29940 net.cpp:166] Creating Layer relu5_1
I0701 17:18:56.168329 29940 net.cpp:496] relu5_1 <- conv5_1
I0701 17:18:56.168341 29940 net.cpp:439] relu5_1 -> conv5_1 (in-place)
I0701 17:18:56.168355 29940 net.cpp:197] Setting up relu5_1
I0701 17:18:56.168368 29940 net.cpp:206] Top shape: 100 512 14 14 (10035200)
I0701 17:18:56.168378 29940 layer_factory.hpp:75] Creating layer conv5_2
I0701 17:18:56.168395 29940 net.cpp:166] Creating Layer conv5_2
I0701 17:18:56.168406 29940 net.cpp:496] conv5_2 <- conv5_1
I0701 17:18:56.168417 29940 net.cpp:452] conv5_2 -> conv5_2
I0701 17:18:56.168431 29940 net.cpp:197] Setting up conv5_2
I0701 17:18:56.178441 29940 net.cpp:206] Top shape: 100 512 14 14 (10035200)
I0701 17:18:56.178478 29940 layer_factory.hpp:75] Creating layer relu5_2
I0701 17:18:56.178493 29940 net.cpp:166] Creating Layer relu5_2
I0701 17:18:56.178501 29940 net.cpp:496] relu5_2 <- conv5_2
I0701 17:18:56.178514 29940 net.cpp:439] relu5_2 -> conv5_2 (in-place)
I0701 17:18:56.178527 29940 net.cpp:197] Setting up relu5_2
I0701 17:18:56.178539 29940 net.cpp:206] Top shape: 100 512 14 14 (10035200)
I0701 17:18:56.178547 29940 layer_factory.hpp:75] Creating layer pool5
I0701 17:18:56.178561 29940 net.cpp:166] Creating Layer pool5
I0701 17:18:56.178570 29940 net.cpp:496] pool5 <- conv5_2
I0701 17:18:56.178581 29940 net.cpp:452] pool5 -> pool5
I0701 17:18:56.178596 29940 net.cpp:197] Setting up pool5
I0701 17:18:56.178611 29940 net.cpp:206] Top shape: 100 512 7 7 (2508800)
I0701 17:18:56.178621 29940 layer_factory.hpp:75] Creating layer fc6
I0701 17:01:10.796613 26739 net.cpp:166] Creating Layer fc6
I0701 17:01:10.796622 26739 net.cpp:496] fc6 <- pool5
I0701 17:01:10.796634 26739 net.cpp:452] fc6 -> fc6
I0701 17:01:10.796650 26739 net.cpp:197] Setting up fc6
I0701 17:01:11.236284 26739 net.cpp:206] Top shape: 100 4096 (409600)
I0701 17:01:11.236351 26739 layer_factory.hpp:75] Creating layer relu6
I0701 17:01:11.236373 26739 net.cpp:166] Creating Layer relu6
I0701 17:01:11.236384 26739 net.cpp:496] relu6 <- fc6
I0701 17:01:11.236404 26739 net.cpp:439] relu6 -> fc6 (in-place)
I0701 17:01:11.236423 26739 net.cpp:197] Setting up relu6
I0701 17:01:11.236435 26739 net.cpp:206] Top shape: 100 4096 (409600)
I0701 17:01:11.236444 26739 layer_factory.hpp:75] Creating layer drop6
I0701 17:01:11.236464 26739 net.cpp:166] Creating Layer drop6
I0701 17:01:11.236472 26739 net.cpp:496] drop6 <- fc6
I0701 17:01:11.236486 26739 net.cpp:439] drop6 -> fc6 (in-place)
I0701 17:01:11.236500 26739 net.cpp:197] Setting up drop6
I0701 17:01:11.236524 26739 net.cpp:206] Top shape: 100 4096 (409600)
I0701 17:01:11.236534 26739 layer_factory.hpp:75] Creating layer fc7
I0701 17:01:11.236549 26739 net.cpp:166] Creating Layer fc7
I0701 17:01:11.236557 26739 net.cpp:496] fc7 <- fc6
I0701 17:01:11.236569 26739 net.cpp:452] fc7 -> fc7
I0701 17:01:11.236585 26739 net.cpp:197] Setting up fc7
I0701 17:01:11.301771 26739 net.cpp:206] Top shape: 100 4096 (409600)
I0701 17:01:11.301842 26739 layer_factory.hpp:75] Creating layer relu7
I0701 17:01:11.301864 26739 net.cpp:166] Creating Layer relu7
I0701 17:01:11.301877 26739 net.cpp:496] relu7 <- fc7
I0701 17:01:11.301898 26739 net.cpp:439] relu7 -> fc7 (in-place)
I0701 17:01:11.301916 26739 net.cpp:197] Setting up relu7
I0701 17:01:11.301929 26739 net.cpp:206] Top shape: 100 4096 (409600)
I0701 17:01:11.301939 26739 layer_factory.hpp:75] Creating layer drop7
I0701 17:01:11.301954 26739 net.cpp:166] Creating Layer drop7
I0701 17:01:11.301962 26739 net.cpp:496] drop7 <- fc7
I0701 17:01:11.301972 26739 net.cpp:439] drop7 -> fc7 (in-place)
I0701 17:01:11.301985 26739 net.cpp:197] Setting up drop7
I0701 17:01:11.302000 26739 net.cpp:206] Top shape: 100 4096 (409600)
I0701 17:01:11.302008 26739 layer_factory.hpp:75] Creating layer fc8
I0701 17:01:11.302023 26739 net.cpp:166] Creating Layer fc8
I0701 17:01:11.302032 26739 net.cpp:496] fc8 <- fc7
I0701 17:01:11.302044 26739 net.cpp:452] fc8 -> fc8
I0701 17:01:11.302058 26739 net.cpp:197] Setting up fc8
I0701 17:01:11.464764 26739 net.cpp:206] Top shape: 100 10000 (1000000)
For how these shapes are computed, see the companion post "[caffe] Deep Learning for Image Classification: Understanding the AlexNet Model".
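As a quick check, Caffe computes a convolution's output spatial size as

    output = floor((input + 2*pad - kernel) / stride) + 1

(pooling layers use ceil in place of floor, which makes no difference here since every division is exact). With VGG's uniform parameters this reproduces every spatial size in the log above:

    conv (3x3, pad 1, stride 1):  (224 + 2*1 - 3)/1 + 1 = 224   -- shape preserved
    pool (2x2, stride 2):         (224 + 0 - 2)/2 + 1   = 112   -- halved

The five pooling layers thus take the spatial size 224 -> 112 -> 56 -> 28 -> 14 -> 7, so pool5 emits 512*7*7 = 25088 values per image, which is exactly what fc6 consumes.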