[Late-Night Treat] Adding a Custom Layer and Its ProtoBuffer Parameters to Caffe
On a speeding train, unable to sleep. Outside the rain drizzles on, and my thoughts drift toward the horizon.
Scrolling back through old chats, I remembered that I still owe a reader a blog post.
So I spent some time writing up a simple walkthrough, from my own early days with Caffe, of adding a custom Layer and custom ProtoBuffer parameters. I hope beginners find it useful.
The material is based on my new book 《深度学习:21 天实战 Caffe》 (Deep Learning: 21 Days of Hands-On Caffe). Readers are welcome to discuss the book's exercise answers in the comments.
Now, on to the main content.
When working with Caffe you often run into needs like these: an existing Layer does not fit my use case; I need a particular feature that the stock code does not implement; or it is implemented, but too slowly, and I have a better implementation.
Option 1: the quick and dirty fix, a.k.a. the old switcheroo
If you are unhappy with the implementation of ConvolutionLayer, simply edit these files in place: $CAFFE_ROOT/include/caffe/layers/conv_layer.hpp and $CAFFE_ROOT/src/caffe/layers/conv_layer.cpp or conv_layer.cu. Replace the im2col + gemm path with your own implementation (say, one based on the Winograd algorithm); a sketch of the relevant call site follows below.
Pros: fast iteration, no need to understand much of the Caffe framework. Rough, but quick and effective.
Cons: the code becomes hard to maintain, can never be merged into the caffe master branch, and will confuse anyone else who uses it (the effect is roughly that of #define TRUE false).
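To make the swap concrete, here is roughly what that call site looks like in recent versions of Caffe. This is a from-memory sketch of the stock conv_layer.cpp, so check it against your own checkout; the comment marks where a Winograd routine would slot in.

template <typename Dtype>
void ConvolutionLayer<Dtype>::Forward_cpu(const vector<Blob<Dtype>*>& bottom,
    const vector<Blob<Dtype>*>& top) {
  const Dtype* weight = this->blobs_[0]->cpu_data();
  for (int i = 0; i < bottom.size(); ++i) {
    const Dtype* bottom_data = bottom[i]->cpu_data();
    Dtype* top_data = top[i]->mutable_cpu_data();
    for (int n = 0; n < this->num_; ++n) {
      // im2col + gemm happens inside this call; a Winograd-based
      // convolution would replace this line.
      this->forward_cpu_gemm(bottom_data + n * this->bottom_dim_, weight,
          top_data + n * this->top_dim_);
      if (this->bias_term_) {
        const Dtype* bias = this->blobs_[1]->cpu_data();
        this->forward_cpu_bias(top_data + n * this->top_dim_, bias);
      }
    }
  }
}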
Option 2: a slightly gentler fix ("a thousand users, a thousand faces")
Similar to Option 1, except that a preprocessor macro decides which implementation gets compiled.
For example, keep the default ConvolutionLayer implementation and add a section like the following to the code:
#ifdef SWITCH_MY_IMPLEMENTATION
// your implementation
#else
// the default implementation
#endif
Then, in code that needs your version of the Layer, add the macro definition:
#define SWITCH_MY_IMPLEMENTATION
and your implementation is compiled in; code that does not define the macro keeps using the stock implementation.
Pros: you can switch flexibly between the old and new implementations.
Cons: every switch requires a recompile (whether the macro is defined in source or passed with the compiler's standard -D flag).
Option 3: the elegant turn ("eighteen bends in the mountain road")
The same functionality has several Layer implementations, and you want to switch between them flexibly without recompiling. How?
This is where the ProtoBuffer machinery becomes indispensable.
First, structure your implementation like any normal Layer class, split into a declaration and a definition, placed in .hpp and .cpp/.cu files respectively. Give the Layer a new name that distinguishes it from the stock implementation. Put the .hpp under $CAFFE_ROOT/include/caffe/layers/ and the .cpp and .cu under $CAFFE_ROOT/src/caffe/layers/; when you run make in $CAFFE_ROOT, these files are picked up by the build automatically, sparing you any manual fiddling with compiler settings.
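For the AllPassLayer example developed below, that layout looks like this (the .cu file is optional if, like this post, you build CPU-only):

$CAFFE_ROOT/include/caffe/layers/all_pass_layer.hpp
$CAFFE_ROOT/src/caffe/layers/all_pass_layer.cpp
$CAFFE_ROOT/src/caffe/layers/all_pass_layer.cu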
Second, add a new field to LayerParameter in $CAFFE_ROOT/src/caffe/proto/caffe.proto, so that the new Layer can be described in train.prototxt, test.prototxt, or deploy.prototxt, which makes it easy to modify the network structure or swap in other Layers with the same function.
Finally, the point that is easiest to overlook: register the new Layer's creator with the Layer factory, or you may hit an error like this at run time (the fix follows right after the log):
F1002 01:51:22.656038 1954701312 layer_factory.hpp:81] Check failed: registry.count(type) == 1 (0 vs. 1) Unknown layer type: AllPass (known types: AbsVal, Accuracy, ArgMax, BNLL, BatchNorm, BatchReindex, Bias, Concat, ContrastiveLoss, Convolution, Crop, Data, Deconvolution, Dropout, DummyData, ELU, Eltwise, Embed, EuclideanLoss, Exp, Filter, Flatten, HDF5Data, HDF5Output, HingeLoss, Im2col, ImageData, InfogainLoss, InnerProduct, Input, LRN, Log, MVN, MemoryData, MultinomialLogisticLoss, PReLU, Pooling, Power, ReLU, Reduction, Reshape, SPP, Scale, Sigmoid, SigmoidCrossEntropyLoss, Silence, Slice, Softmax, SoftmaxWithLoss, Split, TanH, Threshold, Tile, WindowData)
*** Check failure stack trace: ***
@ 0x10243154e google::LogMessage::Fail()
@ 0x102430c53 google::LogMessage::SendToLog()
@ 0x1024311a9 google::LogMessage::Flush()
@ 0x1024344d7 google::LogMessageFatal::~LogMessageFatal()
@ 0x10243183b google::LogMessageFatal::~LogMessageFatal()
@ 0x102215356 caffe::LayerRegistry<>::CreateLayer()
@ 0x102233ccf caffe::Net<>::Init()
@ 0x102235996 caffe::Net<>::Net()
@ 0x102118d8b time()
@ 0x102119c9a main
@ 0x7fff851285ad start
@ 0x4 (unknown)
Abort trap: 6
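The cure is the pair of registration macros that Caffe provides for exactly this; for our example they read as follows (they also appear at the bottom of the full source file below):

INSTANTIATE_CLASS(AllPassLayer);  // instantiate the float and double templates
REGISTER_LAYER_CLASS(AllPass);    // register type "AllPass" with the layer factory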
Below is a concrete example that walks through the whole Option 3 workflow.
We implement a new Layer named AllPassLayer. As the name suggests, it is an all-pass Layer: "all-pass" is borrowed from the all-pass filter of signal processing, which passes a signal from input to output without distortion.
Granted, this Layer is of no damn use by itself, but adding your own processing on top of it is dead easy.
It is also a deliberate choice for experimentation: the all-pass layer's Forward/Backward functions are so simple that the reader needs no calculus or gradient-derivation background, and the layer can be inserted into any existing network without affecting training or inference accuracy.
First, the header file:
#ifndef CAFFE_ALL_PASS_LAYER_HPP_
#define CAFFE_ALL_PASS_LAYER_HPP_

#include <vector>

#include "caffe/blob.hpp"
#include "caffe/layer.hpp"
#include "caffe/proto/caffe.pb.h"
#include "caffe/layers/neuron_layer.hpp"

namespace caffe {

template <typename Dtype>
class AllPassLayer : public NeuronLayer<Dtype> {
 public:
  explicit AllPassLayer(const LayerParameter& param)
      : NeuronLayer<Dtype>(param) {}

  virtual inline const char* type() const { return "AllPass"; }

 protected:
  virtual void Forward_cpu(const vector<Blob<Dtype>*>& bottom,
      const vector<Blob<Dtype>*>& top);
  virtual void Forward_gpu(const vector<Blob<Dtype>*>& bottom,
      const vector<Blob<Dtype>*>& top);
  virtual void Backward_cpu(const vector<Blob<Dtype>*>& top,
      const vector<bool>& propagate_down, const vector<Blob<Dtype>*>& bottom);
  virtual void Backward_gpu(const vector<Blob<Dtype>*>& top,
      const vector<bool>& propagate_down, const vector<Blob<Dtype>*>& bottom);
};

}  // namespace caffe

#endif  // CAFFE_ALL_PASS_LAYER_HPP_
Next, the source file:
#include <algorithm>
#include <vector>
#include <iostream>

#include "caffe/layers/all_pass_layer.hpp"

using namespace std;

#define DEBUG_AP(str) cout << str << endl

namespace caffe {

template <typename Dtype>
void AllPassLayer<Dtype>::Forward_cpu(const vector<Blob<Dtype>*>& bottom,
    const vector<Blob<Dtype>*>& top) {
  const Dtype* bottom_data = bottom[0]->cpu_data();
  Dtype* top_data = top[0]->mutable_cpu_data();
  const int count = bottom[0]->count();
  for (int i = 0; i < count; ++i) {
    top_data[i] = bottom_data[i];
  }
  DEBUG_AP("Here is All Pass Layer, forwarding.");
  DEBUG_AP(this->layer_param_.all_pass_param().key());
}

template <typename Dtype>
void AllPassLayer<Dtype>::Backward_cpu(const vector<Blob<Dtype>*>& top,
    const vector<bool>& propagate_down,
    const vector<Blob<Dtype>*>& bottom) {
  if (propagate_down[0]) {
    const Dtype* top_diff = top[0]->cpu_diff();
    Dtype* bottom_diff = bottom[0]->mutable_cpu_diff();
    const int count = bottom[0]->count();
    for (int i = 0; i < count; ++i) {
      bottom_diff[i] = top_diff[i];
    }
  }
  DEBUG_AP("Here is All Pass Layer, backwarding.");
  DEBUG_AP(this->layer_param_.all_pass_param().key());
}

#ifdef CPU_ONLY
STUB_GPU(AllPassLayer);
#endif

INSTANTIATE_CLASS(AllPassLayer);
REGISTER_LAYER_CLASS(AllPass);

}  // namespace caffe
For lack of time I did not implement the GPU-mode forward and backward, so this example only supports CPU_ONLY mode.
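If you want a GPU path later, a pass-through layer needs no custom kernel: a device-side copy suffices. Below is a minimal, untested sketch of what an all_pass_layer.cu could look like; it is my own addition, not part of the original example. It relies on caffe_copy from caffe/util/math_functions.hpp, and once this file exists the STUB_GPU block in the .cpp must be removed.

#include <vector>

#include "caffe/layers/all_pass_layer.hpp"
#include "caffe/util/math_functions.hpp"

namespace caffe {

template <typename Dtype>
void AllPassLayer<Dtype>::Forward_gpu(const vector<Blob<Dtype>*>& bottom,
    const vector<Blob<Dtype>*>& top) {
  // Pass-through: copy the input blob to the output blob on the device.
  caffe_copy(bottom[0]->count(), bottom[0]->gpu_data(),
      top[0]->mutable_gpu_data());
}

template <typename Dtype>
void AllPassLayer<Dtype>::Backward_gpu(const vector<Blob<Dtype>*>& top,
    const vector<bool>& propagate_down,
    const vector<Blob<Dtype>*>& bottom) {
  if (propagate_down[0]) {
    // Gradient of the identity: copy top diff straight into bottom diff.
    caffe_copy(top[0]->count(), top[0]->gpu_diff(),
        bottom[0]->mutable_gpu_diff());
  }
}

INSTANTIATE_LAYER_GPU_FUNCS(AllPassLayer);

}  // namespace caffe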
Edit caffe.proto: find the LayerParameter message and add one entry:
message LayerParameter {
  optional string name = 1; // the layer name
  optional string type = 2; // the layer type
  repeated string bottom = 3; // the name of each bottom blob
  repeated string top = 4; // the name of each top blob
  // The train / test phase for computation.
  optional Phase phase = 10;
  // The amount of weight to assign each top blob in the objective.
  // Each layer assigns a default value, usually of either 0 or 1,
  // to each top blob.
  repeated float loss_weight = 5;
  // Specifies training parameters (multipliers on global learning constants,
  // and the name and other settings used for weight sharing).
  repeated ParamSpec param = 6;
  // The blobs containing the numeric parameters of the layer.
  repeated BlobProto blobs = 7;
  // Specifies on which bottoms the backpropagation should be skipped.
  // The size must be either 0 or equal to the number of bottoms.
  repeated bool propagate_down = 11;
  // Rules controlling whether and when a layer is included in the network,
  // based on the current NetState.  You may specify a non-zero number of rules
  // to include OR exclude, but not both.  If no include or exclude rules are
  // specified, the layer is always included.  If the current NetState meets
  // ANY (i.e., one or more) of the specified rules, the layer is
  // included/excluded.
  repeated NetStateRule include = 8;
  repeated NetStateRule exclude = 9;
  // Parameters for data pre-processing.
  optional TransformationParameter transform_param = 100;
  // Parameters shared by loss layers.
  optional LossParameter loss_param = 101;
  // Layer type-specific parameters.
  //
  // Note: certain layers may have more than one computational engine
  // for their implementation. These layers include an Engine type and
  // engine parameter for selecting the implementation.
  // The default for the engine is set by the ENGINE switch at compile-time.
  optional AccuracyParameter accuracy_param = 102;
  optional ArgMaxParameter argmax_param = 103;
  optional BatchNormParameter batch_norm_param = 139;
  optional BiasParameter bias_param = 141;
  optional ConcatParameter concat_param = 104;
  optional ContrastiveLossParameter contrastive_loss_param = 105;
  optional ConvolutionParameter convolution_param = 106;
  optional CropParameter crop_param = 144;
  optional DataParameter data_param = 107;
  optional DropoutParameter dropout_param = 108;
  optional DummyDataParameter dummy_data_param = 109;
  optional EltwiseParameter eltwise_param = 110;
  optional ELUParameter elu_param = 140;
  optional EmbedParameter embed_param = 137;
  optional ExpParameter exp_param = 111;
  optional FlattenParameter flatten_param = 135;
  optional HDF5DataParameter hdf5_data_param = 112;
  optional HDF5OutputParameter hdf5_output_param = 113;
  optional HingeLossParameter hinge_loss_param = 114;
  optional ImageDataParameter image_data_param = 115;
  optional InfogainLossParameter infogain_loss_param = 116;
  optional InnerProductParameter inner_product_param = 117;
  optional InputParameter input_param = 143;
  optional LogParameter log_param = 134;
  optional LRNParameter lrn_param = 118;
  optional MemoryDataParameter memory_data_param = 119;
  optional MVNParameter mvn_param = 120;
  optional PoolingParameter pooling_param = 121;
  optional PowerParameter power_param = 122;
  optional PReLUParameter prelu_param = 131;
  optional PythonParameter python_param = 130;
  optional ReductionParameter reduction_param = 136;
  optional ReLUParameter relu_param = 123;
  optional ReshapeParameter reshape_param = 133;
  optional ScaleParameter scale_param = 142;
  optional SigmoidParameter sigmoid_param = 124;
  optional SoftmaxParameter softmax_param = 125;
  optional SPPParameter spp_param = 132;
  optional SliceParameter slice_param = 126;
  optional TanHParameter tanh_param = 127;
  optional ThresholdParameter threshold_param = 128;
  optional TileParameter tile_param = 138;
  optional WindowDataParameter window_data_param = 129;
  optional AllPassParameter all_pass_param = 155;
}
Take care that the new field number does not collide with any number already in use (the comment block above LayerParameter in caffe.proto tracks the next available layer-specific ID for exactly this purpose).
Still in caffe.proto, add the declaration of AllPassParameter; where you put it does not matter. I defined a single parameter that can be read as a preset value from the prototxt.
message AllPassParameter {
  optional float key = 1 [default = 0];
}
In the cpp code, the line
this->layer_param_.all_pass_param().key()
reads the preset value from the prototxt (protoc generates the all_pass_param() accessor on LayerParameter and key() on AllPassParameter; if the prototxt omits key, key() returns the declared default of 0).
Run make clean in $CAFFE_ROOT, then make all again. To get through the build in one pass, keep your code disciplined and stay alert to the usual compile errors.
Everything is now in place except the prototxt.
That part is easy: we write the simplest possible deploy.prototxt, with no data layer and no softmax layer. Just for fun.
name: "AllPassTest"
layer {
name: "data"
type: "Input"
top: "data"
input_param { shape: { dim: 10 dim: 3 dim: 227 dim: 227 } }
}
layer {
name: "ap"
type: "AllPass"
bottom: "data"
top: "conv1"
all_pass_param {
key: 12.88
}
}
Note that the value after type: must be the class name declared in your .hpp with the trailing Layer removed, i.e. the string returned by type(): "AllPass" for AllPassLayer.
Above, the key parameter is preset to 12.88. Yes, you thought of Liu Xiang, didn't you? (12.88 s was his 110 m hurdles world record.)
To verify that the Layer can be created and runs forward and backward correctly, we run the caffe time command on the prototxt we just wrote:
$ ./build/tools/caffe.bin time -model deploy.prototxt
I1002 02:03:41.667682 1954701312 caffe.cpp:312] Use CPU.
I1002 02:03:41.671360 1954701312 net.cpp:49] Initializing net from parameters:
name: "AllPassTest"
state {
phase: TRAIN
}
layer {
name: "data"
type: "Input"
top: "data"
input_param {
shape {
dim: 10
dim: 3
dim: 227
dim: 227
}
}
}
layer {
name: "ap"
type: "AllPass"
bottom: "data"
top: "conv1"
all_pass_param {
key: 12.88
}
}
I1002 02:03:41.671463 1954701312 layer_factory.hpp:77] Creating layer data
I1002 02:03:41.671484 1954701312 net.cpp:91] Creating Layer data
I1002 02:03:41.671499 1954701312 net.cpp:399] data -> data
I1002 02:03:41.671555 1954701312 net.cpp:141] Setting up data
I1002 02:03:41.671566 1954701312 net.cpp:148] Top shape: 10 3 227 227 (1545870)
I1002 02:03:41.671592 1954701312 net.cpp:156] Memory required for data: 6183480
I1002 02:03:41.671605 1954701312 layer_factory.hpp:77] Creating layer ap
I1002 02:03:41.671620 1954701312 net.cpp:91] Creating Layer ap
I1002 02:03:41.671630 1954701312 net.cpp:425] ap <- data
I1002 02:03:41.671644 1954701312 net.cpp:399] ap -> conv1
I1002 02:03:41.671663 1954701312 net.cpp:141] Setting up ap
I1002 02:03:41.671674 1954701312 net.cpp:148] Top shape: 10 3 227 227 (1545870)
I1002 02:03:41.671685 1954701312 net.cpp:156] Memory required for data: 12366960
I1002 02:03:41.671695 1954701312 net.cpp:219] ap does not need backward computation.
I1002 02:03:41.671705 1954701312 net.cpp:219] data does not need backward computation.
I1002 02:03:41.671710 1954701312 net.cpp:261] This network produces output conv1
I1002 02:03:41.671720 1954701312 net.cpp:274] Network initialization done.
I1002 02:03:41.671746 1954701312 caffe.cpp:320] Performing Forward
Here is All Pass Layer, forwarding.
12.88
I1002 02:03:41.679689 1954701312 caffe.cpp:325] Initial loss: 0
I1002 02:03:41.679714 1954701312 caffe.cpp:326] Performing Backward
I1002 02:03:41.679738 1954701312 caffe.cpp:334] *** Benchmark begins ***
I1002 02:03:41.679746 1954701312 caffe.cpp:335] Testing for 50 iterations.
Here is All Pass Layer, forwarding.
12.88
Here is All Pass Layer, backwarding.
12.88
I1002 02:03:41.681139 1954701312 caffe.cpp:363] Iteration: 1 forward-backward time: 1 ms.
Here is All Pass Layer, forwarding.
12.88
Here is All Pass Layer, backwarding.
12.88
I1002 02:03:41.682394 1954701312 caffe.cpp:363] Iteration: 2 forward-backward time: 1 ms.
Here is All Pass Layer, forwarding.
12.88
Here is All Pass Layer, backwarding.
12.88
I1002 02:03:41.683653 1954701312 caffe.cpp:363] Iteration: 3 forward-backward time: 1 ms.
[... iterations 4 through 49 elided; each prints the same forwarding/backwarding lines and completes in 1-2 ms ...]
Here is All Pass Layer, forwarding.
12.88
Here is All Pass Layer, backwarding.
12.88
I1002 02:03:41.751124 1954701312 caffe.cpp:363] Iteration: 50 forward-backward time: 1 ms.
I1002 02:03:41.751147 1954701312 caffe.cpp:366] Average time per layer:
I1002 02:03:41.751157 1954701312 caffe.cpp:369] data forward: 0.00108 ms.
I1002 02:03:41.751183 1954701312 caffe.cpp:372] data backward: 0.001 ms.
I1002 02:03:41.751194 1954701312 caffe.cpp:369] ap forward: 1.37884 ms.
I1002 02:03:41.751205 1954701312 caffe.cpp:372] ap backward: 0.01156 ms.
I1002 02:03:41.751220 1954701312 caffe.cpp:377] Average Forward pass: 1.38646 ms.
I1002 02:03:41.751231 1954701312 caffe.cpp:379] Average Backward pass: 0.0144 ms.
I1002 02:03:41.751240 1954701312 caffe.cpp:381] Average Forward-Backward: 1.42 ms.
I1002 02:03:41.751250 1954701312 caffe.cpp:383] Total Time: 71 ms.
I1002 02:03:41.751260 1954701312 caffe.cpp:384] *** Benchmark ends ***
As the log shows, the Layer is created, loads its preset parameter, and runs its forward and backward functions without trouble.
In practice, an algorithmic Layer also needs a Test Case to guarantee correctness. Since we chose the trivially simple all-pass Layer, that step can be skipped here; I save a little effort and you save a little reading time (though a sketch follows for the curious).
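For readers who do want the test, it would follow the pattern of the existing tests under src/caffe/test/. Here is a minimal sketch; the fixture shapes and file name are my own assumptions, not code from this post.

#include <vector>

#include "gtest/gtest.h"

#include "caffe/blob.hpp"
#include "caffe/filler.hpp"
#include "caffe/layers/all_pass_layer.hpp"
#include "caffe/test/test_caffe_main.hpp"

namespace caffe {

template <typename TypeParam>
class AllPassLayerTest : public MultiDeviceTest<TypeParam> {
  typedef typename TypeParam::Dtype Dtype;

 protected:
  AllPassLayerTest()
      : blob_bottom_(new Blob<Dtype>(2, 3, 4, 5)),
        blob_top_(new Blob<Dtype>()) {
    // Fill the bottom blob with Gaussian noise.
    FillerParameter filler_param;
    GaussianFiller<Dtype> filler(filler_param);
    filler.Fill(this->blob_bottom_);
    blob_bottom_vec_.push_back(blob_bottom_);
    blob_top_vec_.push_back(blob_top_);
  }
  virtual ~AllPassLayerTest() {
    delete blob_bottom_;
    delete blob_top_;
  }
  Blob<Dtype>* const blob_bottom_;
  Blob<Dtype>* const blob_top_;
  vector<Blob<Dtype>*> blob_bottom_vec_;
  vector<Blob<Dtype>*> blob_top_vec_;
};

TYPED_TEST_CASE(AllPassLayerTest, TestDtypesAndDevices);

TYPED_TEST(AllPassLayerTest, TestForward) {
  typedef typename TypeParam::Dtype Dtype;
  LayerParameter layer_param;
  AllPassLayer<Dtype> layer(layer_param);
  layer.SetUp(this->blob_bottom_vec_, this->blob_top_vec_);
  layer.Forward(this->blob_bottom_vec_, this->blob_top_vec_);
  // A pass-through layer must reproduce its input exactly.
  for (int i = 0; i < this->blob_bottom_->count(); ++i) {
    EXPECT_EQ(this->blob_bottom_->cpu_data()[i],
              this->blob_top_->cpu_data()[i]);
  }
}

}  // namespace caffe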
Thanks to every reader for the valuable suggestions and feedback: a priceless supervised-learning dataset, and the back-prop driving force behind my continual updates.
Happy National Day, everyone!