

Adding a Custom Layer and Its ProtoBuffer Parameters to Caffe

Published: 2023/12/4 · Caffe · 豆豆
This article, collected and organized by 生活随笔, introduces adding a custom Layer and its ProtoBuffer parameters to Caffe, and is shared here for reference.

Reposted from: http://blog.csdn.net/kkk584520/article/details/52721838

http://blog.csdn.net/kkk584520


This post is based on the book 《深度學習:21 天實戰 Caffe》 (Deep Learning: Hands-On Caffe in 21 Days); readers are welcome to leave comments to discuss the book's exercise answers. On to the main text.


When working with Caffe, needs like these come up all the time: an existing Layer does not fit my use case; I need such-and-such a feature that the upstream code does not implement; or it is implemented but too slow, and I have a better implementation.


Option 1: the quick and dirty fix — a stealthy swap


If you are unhappy with the ConvolutionLayer implementation, just edit the files directly: $CAFFE_ROOT/include/caffe/layers/conv_layer.hpp and $CAFFE_ROOT/src/caffe/layers/conv_layer.cpp (or conv_layer.cu), replacing im2col + gemm with your own implementation (for example, one based on the Winograd algorithm).

Pros: fast iteration; no deep knowledge of the Caffe framework required; rough but effective.

Cons: the code is hard to maintain, can never be merged into the caffe master branch, and will confuse anyone else who uses it (about as friendly as #define TRUE false).


Option 2: a slightly gentler fix — one face per user

Similar to option 1, except that a preprocessor macro decides which implementation is compiled. For example, keep the default ConvolutionLayer implementation and add a block like this to the code:

    #ifdef SWITCH_MY_IMPLEMENTATION
    // your implementation
    #else
    // default implementation
    #endif

Then, in the code that should use your version of the Layer, add the macro definition:

    #define SWITCH_MY_IMPLEMENTATION

and your implementation takes effect; code built without the macro keeps the stock implementation.
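In miniature, the switch works like this (a standalone sketch; the function and the placeholder logic are illustrative, not taken from Caffe):

```cpp
#include <vector>

// Compile-time switch between two implementations of the same routine.
// Building with -DSWITCH_MY_IMPLEMENTATION selects the custom path;
// a plain build keeps the default one.
std::vector<float> forward(const std::vector<float>& bottom) {
#ifdef SWITCH_MY_IMPLEMENTATION
  // your implementation (placeholder logic: doubles every element)
  std::vector<float> top;
  for (float v : bottom) top.push_back(2.0f * v);
  return top;
#else
  // default implementation: plain copy
  return bottom;
#endif
}
```

Flipping between the two paths means rebuilding with or without -DSWITCH_MY_IMPLEMENTATION, which is exactly the drawback of this option.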


Pros: you can switch flexibly between the old and new implementations.

Cons: every switch requires a recompile.


Option 3: the elegant turn — eighteen bends in the mountain road

What if several Layers implementing the same function should be switchable flexibly, with no recompilation at all?

At this point there is no way around the ProtoBuffer tooling.

First, split your implementation, like any normal Layer class, into a declaration and a definition, placed in .hpp and .cpp/.cu files respectively. Give the Layer a new name that distinguishes it from the stock implementation. Put the .hpp under $CAFFE_ROOT/include/caffe/layers/ and the .cpp/.cu under $CAFFE_ROOT/src/caffe/layers/; running make in $CAFFE_ROOT then picks these files up into the build automatically, sparing you any manual tinkering with compile options.

Second, add a new LayerParameter entry in $CAFFE_ROOT/src/caffe/proto/caffe.proto, so that when writing train.prototxt, test.prototxt, or deploy.prototxt you can describe the new Layer there, making it easy to edit the network structure or substitute another Layer with the same function.

    最后也是最容易忽視的一點,在 Layer 工廠注冊新 Layer 加工函數,不然在你運行過程中可能會報如下錯誤:

    F1002 01:51:22.656038 1954701312 layer_factory.hpp:81] Check failed: registry.count(type) == 1 (0 vs. 1) Unknown layer type: AllPass (known types: AbsVal, Accuracy, ArgMax, BNLL, BatchNorm, BatchReindex, Bias, Concat, ContrastiveLoss, Convolution, Crop, Data, Deconvolution, Dropout, DummyData, ELU, Eltwise, Embed, EuclideanLoss, Exp, Filter, Flatten, HDF5Data, HDF5Output, HingeLoss, Im2col, ImageData, InfogainLoss, InnerProduct, Input, LRN, Log, MVN, MemoryData, MultinomialLogisticLoss, PReLU, Pooling, Power, ReLU, Reduction, Reshape, SPP, Scale, Sigmoid, SigmoidCrossEntropyLoss, Silence, Slice, Softmax, SoftmaxWithLoss, Split, TanH, Threshold, Tile, WindowData)
    *** Check failure stack trace: ***
        @        0x10243154e  google::LogMessage::Fail()
        @        0x102430c53  google::LogMessage::SendToLog()
        @        0x1024311a9  google::LogMessage::Flush()
        @        0x1024344d7  google::LogMessageFatal::~LogMessageFatal()
        @        0x10243183b  google::LogMessageFatal::~LogMessageFatal()
        @        0x102215356  caffe::LayerRegistry<>::CreateLayer()
        @        0x102233ccf  caffe::Net<>::Init()
        @        0x102235996  caffe::Net<>::Net()
        @        0x102118d8b  time()
        @        0x102119c9a  main
        @     0x7fff851285ad  start
        @                0x4  (unknown)
    Abort trap: 6
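The check that fails above lives in Caffe's layer factory: CreateLayer looks the type string up in a registry mapping names to creator functions and aborts when no entry exists. The mechanism can be sketched in miniature (a simplified stand-in; the real registry in layer_factory.hpp is templated on Dtype and reports failure through glog's CHECK rather than an exception):

```cpp
#include <functional>
#include <map>
#include <memory>
#include <stdexcept>
#include <string>

// Minimal sketch of the registry pattern behind REGISTER_LAYER_CLASS.
struct Layer {
  virtual ~Layer() = default;
};

using Creator = std::function<std::unique_ptr<Layer>()>;

// A single global table mapping a layer type name to its creator.
std::map<std::string, Creator>& layer_registry() {
  static std::map<std::string, Creator> table;
  return table;
}

// REGISTER_LAYER_CLASS expands, roughly, to a static object whose
// constructor performs a registration like this one.
void register_layer(const std::string& type, Creator create) {
  layer_registry()[type] = std::move(create);
}

// Counterpart of LayerRegistry::CreateLayer: an unregistered type
// fails loudly, which is the "Unknown layer type" error above.
std::unique_ptr<Layer> create_layer(const std::string& type) {
  auto it = layer_registry().find(type);
  if (it == layer_registry().end()) {
    throw std::runtime_error("Unknown layer type: " + type);
  }
  return it->second();
}

struct AllPassLayer : Layer {};  // placeholder for the layer built below
```

Registration happens as a side effect of loading the binary, which is why forgetting the registration macro only shows up at run time, when the factory is asked for a name it has never seen.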



Below is a concrete example that walks through the whole of option 3.

Here we implement a new Layer named AllPassLayer. As the name suggests, it is an all-pass Layer; "all-pass" is borrowed from the all-pass filter in signal processing, which passes a signal from input to output without distortion.

The Layer itself is admittedly useless, but grafting your own processing onto it is then very easy. It is also a deliberate experimental choice: the all-pass Forward/Backward functions are so simple that the reader needs no calculus or derivative background, and the layer can be inserted into any existing network without affecting training or prediction accuracy.
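Stripped of Caffe's Blob machinery, the layer's math is an element-wise copy in both directions: forward computes y_i = x_i, and since dy_i/dx_i = 1, the chain rule makes backward copy the top gradient straight into the bottom gradient. A standalone sketch of both passes:

```cpp
#include <vector>

// Forward pass of an all-pass layer: y_i = x_i.
std::vector<float> all_pass_forward(const std::vector<float>& bottom) {
  return bottom;
}

// Backward pass: dL/dx_i = dL/dy_i * dy_i/dx_i = dL/dy_i * 1.
std::vector<float> all_pass_backward(const std::vector<float>& top_diff) {
  return top_diff;
}
```

Because both passes are exact identities, inserting the layer between any two layers of a network changes neither the activations nor the gradients flowing through it.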


First, the header file:

    #ifndef CAFFE_ALL_PASS_LAYER_HPP_
    #define CAFFE_ALL_PASS_LAYER_HPP_

    #include <vector>

    #include "caffe/blob.hpp"
    #include "caffe/layer.hpp"
    #include "caffe/proto/caffe.pb.h"

    #include "caffe/layers/neuron_layer.hpp"

    namespace caffe {
    template <typename Dtype>
    class AllPassLayer : public NeuronLayer<Dtype> {
     public:
      explicit AllPassLayer(const LayerParameter& param)
          : NeuronLayer<Dtype>(param) {}

      virtual inline const char* type() const { return "AllPass"; }

     protected:
      virtual void Forward_cpu(const vector<Blob<Dtype>*>& bottom,
          const vector<Blob<Dtype>*>& top);
      virtual void Forward_gpu(const vector<Blob<Dtype>*>& bottom,
          const vector<Blob<Dtype>*>& top);
      virtual void Backward_cpu(const vector<Blob<Dtype>*>& top,
          const vector<bool>& propagate_down, const vector<Blob<Dtype>*>& bottom);
      virtual void Backward_gpu(const vector<Blob<Dtype>*>& top,
          const vector<bool>& propagate_down, const vector<Blob<Dtype>*>& bottom);
    };

    }  // namespace caffe

    #endif  // CAFFE_ALL_PASS_LAYER_HPP_


Next, the source file:

    #include <algorithm>
    #include <vector>

    #include "caffe/layers/all_pass_layer.hpp"

    #include <iostream>
    using namespace std;
    #define DEBUG_AP(str) cout << str << endl
    namespace caffe {

    template <typename Dtype>
    void AllPassLayer<Dtype>::Forward_cpu(const vector<Blob<Dtype>*>& bottom,
        const vector<Blob<Dtype>*>& top) {
      const Dtype* bottom_data = bottom[0]->cpu_data();
      Dtype* top_data = top[0]->mutable_cpu_data();
      const int count = bottom[0]->count();
      for (int i = 0; i < count; ++i) {
        top_data[i] = bottom_data[i];
      }
      DEBUG_AP("Here is All Pass Layer, forwarding.");
      DEBUG_AP(this->layer_param_.all_pass_param().key());
    }

    template <typename Dtype>
    void AllPassLayer<Dtype>::Backward_cpu(const vector<Blob<Dtype>*>& top,
        const vector<bool>& propagate_down,
        const vector<Blob<Dtype>*>& bottom) {
      if (propagate_down[0]) {
        const Dtype* bottom_data = bottom[0]->cpu_data();
        const Dtype* top_diff = top[0]->cpu_diff();
        Dtype* bottom_diff = bottom[0]->mutable_cpu_diff();
        const int count = bottom[0]->count();
        for (int i = 0; i < count; ++i) {
          bottom_diff[i] = top_diff[i];
        }
      }
      DEBUG_AP("Here is All Pass Layer, backwarding.");
      DEBUG_AP(this->layer_param_.all_pass_param().key());
    }

    #ifdef CPU_ONLY
    STUB_GPU(AllPassLayer);
    #endif

    INSTANTIATE_CLASS(AllPassLayer);
    REGISTER_LAYER_CLASS(AllPass);
    }  // namespace caffe


For reasons of time I did not implement the GPU-mode forward/backward, so the example in this post supports CPU_ONLY mode only.


Edit caffe.proto, find the LayerParameter message, and add one entry:

    message LayerParameter {
      optional string name = 1; // the layer name
      optional string type = 2; // the layer type
      repeated string bottom = 3; // the name of each bottom blob
      repeated string top = 4; // the name of each top blob

      // The train / test phase for computation.
      optional Phase phase = 10;

      // The amount of weight to assign each top blob in the objective.
      // Each layer assigns a default value, usually of either 0 or 1,
      // to each top blob.
      repeated float loss_weight = 5;

      // Specifies training parameters (multipliers on global learning constants,
      // and the name and other settings used for weight sharing).
      repeated ParamSpec param = 6;

      // The blobs containing the numeric parameters of the layer.
      repeated BlobProto blobs = 7;

      // Specifies on which bottoms the backpropagation should be skipped.
      // The size must be either 0 or equal to the number of bottoms.
      repeated bool propagate_down = 11;

      // Rules controlling whether and when a layer is included in the network,
      // based on the current NetState.  You may specify a non-zero number of rules
      // to include OR exclude, but not both.  If no include or exclude rules are
      // specified, the layer is always included.  If the current NetState meets
      // ANY (i.e., one or more) of the specified rules, the layer is
      // included/excluded.
      repeated NetStateRule include = 8;
      repeated NetStateRule exclude = 9;

      // Parameters for data pre-processing.
      optional TransformationParameter transform_param = 100;

      // Parameters shared by loss layers.
      optional LossParameter loss_param = 101;

      // Layer type-specific parameters.
      //
      // Note: certain layers may have more than one computational engine
      // for their implementation. These layers include an Engine type and
      // engine parameter for selecting the implementation.
      // The default for the engine is set by the ENGINE switch at compile-time.
      optional AccuracyParameter accuracy_param = 102;
      optional ArgMaxParameter argmax_param = 103;
      optional BatchNormParameter batch_norm_param = 139;
      optional BiasParameter bias_param = 141;
      optional ConcatParameter concat_param = 104;
      optional ContrastiveLossParameter contrastive_loss_param = 105;
      optional ConvolutionParameter convolution_param = 106;
      optional CropParameter crop_param = 144;
      optional DataParameter data_param = 107;
      optional DropoutParameter dropout_param = 108;
      optional DummyDataParameter dummy_data_param = 109;
      optional EltwiseParameter eltwise_param = 110;
      optional ELUParameter elu_param = 140;
      optional EmbedParameter embed_param = 137;
      optional ExpParameter exp_param = 111;
      optional FlattenParameter flatten_param = 135;
      optional HDF5DataParameter hdf5_data_param = 112;
      optional HDF5OutputParameter hdf5_output_param = 113;
      optional HingeLossParameter hinge_loss_param = 114;
      optional ImageDataParameter image_data_param = 115;
      optional InfogainLossParameter infogain_loss_param = 116;
      optional InnerProductParameter inner_product_param = 117;
      optional InputParameter input_param = 143;
      optional LogParameter log_param = 134;
      optional LRNParameter lrn_param = 118;
      optional MemoryDataParameter memory_data_param = 119;
      optional MVNParameter mvn_param = 120;
      optional PoolingParameter pooling_param = 121;
      optional PowerParameter power_param = 122;
      optional PReLUParameter prelu_param = 131;
      optional PythonParameter python_param = 130;
      optional ReductionParameter reduction_param = 136;
      optional ReLUParameter relu_param = 123;
      optional ReshapeParameter reshape_param = 133;
      optional ScaleParameter scale_param = 142;
      optional SigmoidParameter sigmoid_param = 124;
      optional SoftmaxParameter softmax_param = 125;
      optional SPPParameter spp_param = 132;
      optional SliceParameter slice_param = 126;
      optional TanHParameter tanh_param = 127;
      optional ThresholdParameter threshold_param = 128;
      optional TileParameter tile_param = 138;
      optional WindowDataParameter window_data_param = 129;
      optional AllPassParameter all_pass_param = 155;
    }

Make sure the new field number does not collide with any existing Layer's number.


Still in caffe.proto, add the AllPassParameter declaration; its position is arbitrary. I defined a single parameter, which can be used to read a preset value from the prototxt.


    message AllPassParameter {
      optional float key = 1 [default = 0];
    }

In the cpp code, the line

    this->layer_param_.all_pass_param().key()

reads the preset value from the prototxt.
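For reference, protoc compiles this message into a C++ class whose key() accessor returns the declared default until a value has been parsed from the prototxt. The behavior can be sketched like this (a hand-written stand-in, far simpler than the real generated code):

```cpp
// Sketch of the accessor protoc generates for
// `optional float key = 1 [default = 0];` (simplified).
class AllPassParameter {
 public:
  float key() const { return has_key_ ? key_ : 0.0f; }  // 0 is the declared default
  void set_key(float value) { key_ = value; has_key_ = true; }
  bool has_key() const { return has_key_; }

 private:
  float key_ = 0.0f;
  bool has_key_ = false;
};
```

This is why a prototxt that omits all_pass_param still works: key() simply falls back to the default.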

Run make clean in $CAFFE_ROOT, then make all again. To compile cleanly on the first try, keep your code disciplined and stay alert to the usual sources of build errors.


Everything is now in place except the prototxt.


It is not hard: we write the simplest possible deploy.prototxt, with no data layer and no softmax layer, just for fun.

    name: "AllPassTest"
    layer {
      name: "data"
      type: "Input"
      top: "data"
      input_param { shape: { dim: 10 dim: 3 dim: 227 dim: 227 } }
    }
    layer {
      name: "ap"
      type: "AllPass"
      bottom: "data"
      top: "conv1"
      all_pass_param {
        key: 12.88
      }
    }


Note: the value written after type: should be the class name you declared in the .hpp, with the trailing Layer removed.

Above we set the preset value of the key parameter to 12.88. Yes, you thought of Liu Xiang, didn't you.


To verify that the Layer can be created correctly and that its forward and backward passes execute, we run the caffe time command against the prototxt we just wrote:

    $ ./build/tools/caffe.bin time -model deploy.prototxt
    I1002 02:03:41.667682 1954701312 caffe.cpp:312] Use CPU.
    I1002 02:03:41.671360 1954701312 net.cpp:49] Initializing net from parameters:
    name: "AllPassTest"
    state {
      phase: TRAIN
    }
    layer {
      name: "data"
      type: "Input"
      top: "data"
      input_param {
        shape {
          dim: 10
          dim: 3
          dim: 227
          dim: 227
        }
      }
    }
    layer {
      name: "ap"
      type: "AllPass"
      bottom: "data"
      top: "conv1"
      all_pass_param {
        key: 12.88
      }
    }
    I1002 02:03:41.671463 1954701312 layer_factory.hpp:77] Creating layer data
    I1002 02:03:41.671484 1954701312 net.cpp:91] Creating Layer data
    I1002 02:03:41.671499 1954701312 net.cpp:399] data -> data
    I1002 02:03:41.671555 1954701312 net.cpp:141] Setting up data
    I1002 02:03:41.671566 1954701312 net.cpp:148] Top shape: 10 3 227 227 (1545870)
    I1002 02:03:41.671592 1954701312 net.cpp:156] Memory required for data: 6183480
    I1002 02:03:41.671605 1954701312 layer_factory.hpp:77] Creating layer ap
    I1002 02:03:41.671620 1954701312 net.cpp:91] Creating Layer ap
    I1002 02:03:41.671630 1954701312 net.cpp:425] ap <- data
    I1002 02:03:41.671644 1954701312 net.cpp:399] ap -> conv1
    I1002 02:03:41.671663 1954701312 net.cpp:141] Setting up ap
    I1002 02:03:41.671674 1954701312 net.cpp:148] Top shape: 10 3 227 227 (1545870)
    I1002 02:03:41.671685 1954701312 net.cpp:156] Memory required for data: 12366960
    I1002 02:03:41.671695 1954701312 net.cpp:219] ap does not need backward computation.
    I1002 02:03:41.671705 1954701312 net.cpp:219] data does not need backward computation.
    I1002 02:03:41.671710 1954701312 net.cpp:261] This network produces output conv1
    I1002 02:03:41.671720 1954701312 net.cpp:274] Network initialization done.
    I1002 02:03:41.671746 1954701312 caffe.cpp:320] Performing Forward
    Here is All Pass Layer, forwarding.
    12.88
    I1002 02:03:41.679689 1954701312 caffe.cpp:325] Initial loss: 0
    I1002 02:03:41.679714 1954701312 caffe.cpp:326] Performing Backward
    I1002 02:03:41.679738 1954701312 caffe.cpp:334] *** Benchmark begins ***
    I1002 02:03:41.679746 1954701312 caffe.cpp:335] Testing for 50 iterations.
    Here is All Pass Layer, forwarding.
    12.88
    Here is All Pass Layer, backwarding.
    12.88
    I1002 02:03:41.681139 1954701312 caffe.cpp:363] Iteration: 1 forward-backward time: 1 ms.
    Here is All Pass Layer, forwarding.
    12.88
    Here is All Pass Layer, backwarding.
    12.88
    I1002 02:03:41.682394 1954701312 caffe.cpp:363] Iteration: 2 forward-backward time: 1 ms.
    [... iterations 3 through 49 elided; each repeats the same forwarding/backwarding lines ...]
    Here is All Pass Layer, forwarding.
    12.88
    Here is All Pass Layer, backwarding.
    12.88
    I1002 02:03:41.751124 1954701312 caffe.cpp:363] Iteration: 50 forward-backward time: 1 ms.
    I1002 02:03:41.751147 1954701312 caffe.cpp:366] Average time per layer:
    I1002 02:03:41.751157 1954701312 caffe.cpp:369]       data  forward: 0.00108 ms.
    I1002 02:03:41.751183 1954701312 caffe.cpp:372]       data  backward: 0.001 ms.
    I1002 02:03:41.751194 1954701312 caffe.cpp:369]         ap  forward: 1.37884 ms.
    I1002 02:03:41.751205 1954701312 caffe.cpp:372]         ap  backward: 0.01156 ms.
    I1002 02:03:41.751220 1954701312 caffe.cpp:377] Average Forward pass: 1.38646 ms.
    I1002 02:03:41.751231 1954701312 caffe.cpp:379] Average Backward pass: 0.0144 ms.
    I1002 02:03:41.751240 1954701312 caffe.cpp:381] Average Forward-Backward: 1.42 ms.
    I1002 02:03:41.751250 1954701312 caffe.cpp:383] Total Time: 71 ms.
    I1002 02:03:41.751260 1954701312 caffe.cpp:384] *** Benchmark ends ***

As the log shows, the Layer is created, loads its preset parameter, and executes its forward and backward functions normally.
