While looking for a DenseNet implementation, I found a clearly structured and complete network code, which I am backing up here.

https://github.com/taki0112/Densenet-Tensorflow

Densenet-Tensorflow

Tensorflow implementation of Densenet using Cifar10, MNIST

  • The code that implements this paper is Densenet.py
  • There is a slight difference: I used AdamOptimizer

If you want to see the original author's code or other implementations, please refer to this link

Requirements

  • Tensorflow 1.x
  • Python 3.x
  • tflearn (if you want an easy way to do global average pooling, install tflearn; however, I implemented it using tf.layers, so it is optional)

Issue

  • I used tf.contrib.layers.batch_norm (see the usage sketch after this list)

    # Imports assumed by these snippets (adjust to match the top of Densenet.py).
    import numpy as np
    import tensorflow as tf
    from tensorflow.contrib.framework import arg_scope
    from tensorflow.contrib.layers import batch_norm

    def Batch_Normalization(x, training, scope):
        with arg_scope([batch_norm],
                       scope=scope,
                       updates_collections=None,
                       decay=0.9,
                       center=True,
                       scale=True,
                       zero_debias_moving_mean=True):
            return tf.cond(training,
                           lambda: batch_norm(inputs=x, is_training=training, reuse=None),
                           lambda: batch_norm(inputs=x, is_training=training, reuse=True))
  • If you do not have enough GPU memory, edit how the session is created:

    with tf.Session() as sess:                                                   # NO
    with tf.Session(config=tf.ConfigProto(allow_soft_placement=True)) as sess:   # OK
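
Putting the two notes above together, here is a minimal usage sketch (my own illustration, not from the README; the input shape, the dummy batch, and the allow_growth option are assumptions):

    import numpy as np
    import tensorflow as tf

    # Hypothetical input tensor and phase flag, mirroring how such a graph is fed.
    x = tf.placeholder(tf.float32, [None, 28, 28, 1])
    training_flag = tf.placeholder(tf.bool)

    bn_out = Batch_Normalization(x, training=training_flag, scope='example_batch')

    config = tf.ConfigProto(allow_soft_placement=True)
    config.gpu_options.allow_growth = True  # assumption: claim GPU memory on demand

    with tf.Session(config=config) as sess:
        sess.run(tf.global_variables_initializer())
        batch = np.zeros([64, 28, 28, 1], dtype=np.float32)           # dummy batch
        sess.run(bn_out, feed_dict={x: batch, training_flag: True})   # training phase
        sess.run(bn_out, feed_dict={x: batch, training_flag: False})  # inference phase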

Idea

What is the "Global Average Pooling" ?

    def Global_Average_Pooling(x, stride=1):
        width = np.shape(x)[1]
        height = np.shape(x)[2]
        pool_size = [width, height]
        return tf.layers.average_pooling2d(inputs=x, pool_size=pool_size, strides=stride)
        # The stride value does not matter
If you use tflearn, please refer to this link

    def Global_Average_Pooling(x):
        return tflearn.layers.conv.global_avg_pool(x, name='Global_avg_pooling')
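
The same operation can also be written without computing the pool size at all: global average pooling is just the mean over the spatial axes of an NHWC tensor. A minimal alternative sketch (mine, not from the repo):

    def Global_Average_Pooling_v2(x):
        # Average over height and width; keepdims preserves the [N, 1, 1, C] shape
        # produced by the tf.layers version above (keepdims needs TF >= 1.5).
        return tf.reduce_mean(x, axis=[1, 2], keepdims=True)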

What is the "Dense Connectivity" ?

What is the "Densenet Architecture" ?

    def Dense_net(self, input_x):
        # Stem: 7x7 conv (stride 2) followed by 3x3 max pooling (stride 2).
        x = conv_layer(input_x, filter=2 * self.filters, kernel=[7,7], stride=2, layer_name='conv0')
        x = Max_Pooling(x, pool_size=[3,3], stride=2)
        # Dense blocks with transition layers in between.
        x = self.dense_block(input_x=x, nb_layers=6, layer_name='dense_1')
        x = self.transition_layer(x, scope='trans_1')
        x = self.dense_block(input_x=x, nb_layers=12, layer_name='dense_2')
        x = self.transition_layer(x, scope='trans_2')
        x = self.dense_block(input_x=x, nb_layers=48, layer_name='dense_3')
        x = self.transition_layer(x, scope='trans_3')
        x = self.dense_block(input_x=x, nb_layers=32, layer_name='dense_final')
        # Classification head: BN -> ReLU -> global average pooling -> linear.
        x = Batch_Normalization(x, training=self.training, scope='linear_batch')
        x = Relu(x)
        x = Global_Average_Pooling(x)
        x = Linear(x)
        return x
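
Dense_net is written in terms of thin wrappers such as conv_layer, Max_Pooling, Relu, Drop_out, Average_pooling, and Linear. They are defined in Densenet.py; the following is only my reconstruction of what such wrappers might look like with tf.layers, with signatures inferred from how they are called in this post:

    import tensorflow as tf

    def conv_layer(x, filter, kernel, stride=1, layer_name="conv"):
        # Plain 2-D convolution, 'SAME' padding, no bias or activation
        # (the activation comes from the surrounding BN -> ReLU pattern).
        with tf.name_scope(layer_name):
            return tf.layers.conv2d(inputs=x, filters=filter, kernel_size=kernel,
                                    strides=stride, padding='SAME', use_bias=False)

    def Relu(x):
        return tf.nn.relu(x)

    def Max_Pooling(x, pool_size=[3, 3], stride=2, padding='SAME'):
        return tf.layers.max_pooling2d(inputs=x, pool_size=pool_size,
                                       strides=stride, padding=padding)

    def Average_pooling(x, pool_size=[2, 2], stride=2, padding='SAME'):
        return tf.layers.average_pooling2d(inputs=x, pool_size=pool_size,
                                           strides=stride, padding=padding)

    def Drop_out(x, rate, training):
        return tf.layers.dropout(inputs=x, rate=rate, training=training)

    def Linear(x, class_num=10):
        # Flatten, then a fully connected layer producing the class logits.
        return tf.layers.dense(inputs=tf.layers.flatten(x), units=class_num)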

What is the "Dense Block" ?

    def dense_block(self, input_x, nb_layers, layer_name):
        with tf.name_scope(layer_name):
            layers_concat = list()
            layers_concat.append(input_x)
            # First bottleneck layer.
            x = self.bottleneck_layer(input_x, scope=layer_name + '_bottleN_' + str(0))
            layers_concat.append(x)
            # Each remaining layer sees the concatenation of all previous outputs.
            for i in range(nb_layers - 1):
                x = Concatenation(layers_concat)
                x = self.bottleneck_layer(x, scope=layer_name + '_bottleN_' + str(i + 1))
                layers_concat.append(x)
            return x
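
The growth is easy to see in layers_concat: every bottleneck layer appends k = self.filters channels to the running concatenation, so the channel count grows linearly with depth inside a block. A quick arithmetic sketch (my illustration; the concrete numbers are hypothetical):

    # Channels of the concatenation after a dense block's bottleneck layers.
    def concat_channels(input_channels, nb_layers, growth_rate):
        return input_channels + nb_layers * growth_rate

    print(concat_channels(48, 6, 24))   # e.g. 48 input channels, 6 layers, k = 24 -> 192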

What is the "Bottleneck Layer" ?

    def bottleneck_layer(self, x, scope):
        with tf.name_scope(scope):
            # BN -> ReLU -> 1x1 conv (4k filters) -> dropout
            x = Batch_Normalization(x, training=self.training, scope=scope+'_batch1')
            x = Relu(x)
            x = conv_layer(x, filter=4 * self.filters, kernel=[1,1], layer_name=scope+'_conv1')
            x = Drop_out(x, rate=dropout_rate, training=self.training)
            # BN -> ReLU -> 3x3 conv (k filters) -> dropout
            x = Batch_Normalization(x, training=self.training, scope=scope+'_batch2')
            x = Relu(x)
            x = conv_layer(x, filter=self.filters, kernel=[3,3], layer_name=scope+'_conv2')
            x = Drop_out(x, rate=dropout_rate, training=self.training)
            return x
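
The 1x1 convolution is what makes this a bottleneck: it maps an arbitrarily wide input down to 4·k channels before the 3x3 convolution, and the savings grow as the concatenated input gets wider late in a block. A rough weight count for comparison (my arithmetic; c_in and k are hypothetical):

    # 3x3 conv applied directly vs. behind a 1x1 bottleneck (biases ignored).
    c_in, k = 256, 24
    direct_3x3 = 3 * 3 * c_in * k                                  # 55,296 weights
    bottleneck = 1 * 1 * c_in * (4 * k) + 3 * 3 * (4 * k) * k      # 24,576 + 20,736 = 45,312
    print(direct_3x3, bottleneck)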

What is the "Transition Layer" ?

    def transition_layer(self, x, scope):
        with tf.name_scope(scope):
            # BN -> ReLU -> 1x1 conv -> dropout, then 2x2 average pooling
            # to halve the spatial resolution between dense blocks.
            x = Batch_Normalization(x, training=self.training, scope=scope+'_batch1')
            x = Relu(x)
            x = conv_layer(x, filter=self.filters, kernel=[1,1], layer_name=scope+'_conv1')
            x = Drop_out(x, rate=dropout_rate, training=self.training)
            x = Average_pooling(x, pool_size=[2,2], stride=2)
            return x
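
One difference worth noting: in the DenseNet paper the transition layer's 1x1 convolution keeps θ·C_in channels (a compression factor applied to the incoming width), whereas here it is fixed to self.filters. A hedged sketch of the paper-style variant (my code, not part of Densenet.py):

    def transition_layer_compressed(self, x, scope, theta=0.5):
        with tf.name_scope(scope):
            in_channels = int(x.get_shape()[-1])   # static channel count of the input
            x = Batch_Normalization(x, training=self.training, scope=scope+'_batch1')
            x = Relu(x)
            # Keep theta * C_in channels instead of a fixed self.filters.
            x = conv_layer(x, filter=int(theta * in_channels), kernel=[1,1], layer_name=scope+'_conv1')
            x = Drop_out(x, rate=dropout_rate, training=self.training)
            x = Average_pooling(x, pool_size=[2,2], stride=2)
            return x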

Compare Structure (CNN, ResNet, DenseNet)

Results

  • (MNIST) The highest test accuracy is 99.2% (This result does not use dropout)
  • The number of dense block layers is fixed to 4
    for i in range(self.nb_blocks):
        # original : 6 -> 12 -> 48
        x = self.dense_block(input_x=x, nb_layers=4, layer_name='dense_' + str(i))
        x = self.transition_layer(x, scope='trans_' + str(i))

CIFAR-10

CIFAR-100

Image Net

Related works

References

Author

Junho Kim
