Caffe Learning Series (17): Visualizing the Data and Parameters of Each Layer of a Model


First train on cifar10 with Caffe and save the resulting model as a caffemodel. Then pick one image from the test set, run it through the network, and visualize the data and parameters of each layer.
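If the caffemodel has not been produced yet, one way to obtain it is to drive the stock cifar10_quick solver from pycaffe. This is only a minimal sketch, assuming the standard solver prototxt shipped with Caffe and that the cifar10 LMDBs and mean.binaryproto have already been generated by the data-preparation script in examples/cifar10:

# Minimal training sketch (an assumption, not part of the original walkthrough).
import os, sys
caffe_root = '/home/bnu/caffe/'
sys.path.insert(0, caffe_root + 'python')
import caffe

os.chdir(caffe_root)          # the solver prototxt uses paths relative to caffe_root
caffe.set_mode_gpu()          # or caffe.set_mode_cpu()
solver = caffe.SGDSolver('examples/cifar10/cifar10_quick_solver.prototxt')
solver.solve()                # writes snapshots such as cifar10_quick_iter_4000.caffemodel
                              # (the exact file name depends on the solver's snapshot settings)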

In [1]:
# load the required libraries
import numpy as np
import matplotlib.pyplot as plt
%matplotlib inline
import sys, os, caffe

In [2]:
# set the working directory and check that the trained model exists
caffe_root = '/home/bnu/caffe/'
sys.path.insert(0, caffe_root + 'python')
os.chdir(caffe_root)
if not os.path.isfile(caffe_root + 'examples/cifar10/cifar10_quick_iter_4000.caffemodel'):
    print("caffemodel does not exist...")

In [3]:
# set up the test network from the previously trained model
caffe.set_mode_gpu()
net = caffe.Net(caffe_root + 'examples/cifar10/cifar10_quick.prototxt',
                caffe_root + 'examples/cifar10/cifar10_quick_iter_4000.caffemodel',
                caffe.TEST)

In [4]:
net.blobs['data'].data.shape
Out[4]: (1, 3, 32, 32)

In [5]:
# load the test image and display it
im = caffe.io.load_image('examples/images/32.jpg')
print im.shape
plt.imshow(im)
plt.axis('off')
(32, 32, 3)
Out[5]: (-0.5, 31.5, 31.5, -0.5)

In [6]:
# helper that converts the binary mean file into a numpy mean file
def convert_mean(binMean, npyMean):
    blob = caffe.proto.caffe_pb2.BlobProto()
    bin_mean = open(binMean, 'rb').read()
    blob.ParseFromString(bin_mean)
    arr = np.array(caffe.io.blobproto_to_array(blob))
    npy_mean = arr[0]
    np.save(npyMean, npy_mean)

binMean = caffe_root + 'examples/cifar10/mean.binaryproto'
npyMean = caffe_root + 'examples/cifar10/mean.npy'
convert_mean(binMean, npyMean)

In [7]:
# load the image into the data blob and subtract the mean
transformer = caffe.io.Transformer({'data': net.blobs['data'].data.shape})
transformer.set_transpose('data', (2, 0, 1))
transformer.set_mean('data', np.load(npyMean).mean(1).mean(1))  # subtract the mean
transformer.set_raw_scale('data', 255)
transformer.set_channel_swap('data', (2, 1, 0))
net.blobs['data'].data[...] = transformer.preprocess('data', im)
inputData = net.blobs['data'].data

In [8]:
# show the image before and after mean subtraction
plt.figure()
plt.subplot(1, 2, 1), plt.title("origin")
plt.imshow(im)
plt.axis('off')
plt.subplot(1, 2, 2), plt.title("subtract mean")
plt.imshow(transformer.deprocess('data', inputData[0]))
plt.axis('off')
Out[8]: (-0.5, 31.5, 31.5, -0.5)

In [9]:
# run the test network and list the data shape of every layer
net.forward()
[(k, v.data.shape) for k, v in net.blobs.items()]
Out[9]:
[('data', (1, 3, 32, 32)),
 ('conv1', (1, 32, 32, 32)),
 ('pool1', (1, 32, 16, 16)),
 ('conv2', (1, 32, 16, 16)),
 ('pool2', (1, 32, 8, 8)),
 ('conv3', (1, 64, 8, 8)),
 ('pool3', (1, 64, 4, 4)),
 ('ip1', (1, 64)),
 ('ip2', (1, 10)),
 ('prob', (1, 10))]

In [10]:
# list the parameter (weight) shapes of every layer
[(k, v[0].data.shape) for k, v in net.params.items()]
Out[10]:
[('conv1', (32, 3, 5, 5)),
 ('conv2', (32, 32, 5, 5)),
 ('conv3', (64, 32, 5, 5)),
 ('ip1', (64, 1024)),
 ('ip2', (10, 64))]

In [11]:
# helper that tiles the feature maps (or filters) of one layer into a single image
def show_data(data, padsize=1, padval=0):
    data -= data.min()
    data /= data.max()
    # force the number of filters to be square
    n = int(np.ceil(np.sqrt(data.shape[0])))
    padding = ((0, n ** 2 - data.shape[0]), (0, padsize), (0, padsize)) + ((0, 0),) * (data.ndim - 3)
    data = np.pad(data, padding, mode='constant', constant_values=(padval, padval))
    # tile the filters into an image
    data = data.reshape((n, n) + data.shape[1:]).transpose((0, 2, 1, 3) + tuple(range(4, data.ndim + 1)))
    data = data.reshape((n * data.shape[1], n * data.shape[3]) + data.shape[4:])
    plt.figure()
    plt.imshow(data, cmap='gray')
    plt.axis('off')

plt.rcParams['figure.figsize'] = (8, 8)
plt.rcParams['image.interpolation'] = 'nearest'
plt.rcParams['image.cmap'] = 'gray'

In [12]:
# show the output data and the weights (filters) of the first convolution layer
show_data(net.blobs['conv1'].data[0])
print net.blobs['conv1'].data.shape
show_data(net.params['conv1'][0].data.reshape(32*3, 5, 5))
print net.params['conv1'][0].data.shape
(1, 32, 32, 32)
(32, 3, 5, 5)

In [13]:
# show the output data after the first pooling layer
show_data(net.blobs['pool1'].data[0])
net.blobs['pool1'].data.shape
Out[13]: (1, 32, 16, 16)

In [14]:
# show the output data of the second convolution layer and its weights (filters)
show_data(net.blobs['conv2'].data[0], padval=0.5)
print net.blobs['conv2'].data.shape
show_data(net.params['conv2'][0].data.reshape(32**2, 5, 5))
print net.params['conv2'][0].data.shape
(1, 32, 16, 16)
(32, 32, 5, 5)

In [15]:
# show the output data of the third convolution layer and its weights (filters); only the first 1024 filter slices are shown
show_data(net.blobs['conv3'].data[0], padval=0.5)
print net.blobs['conv3'].data.shape
show_data(net.params['conv3'][0].data.reshape(64*32, 5, 5)[:1024])
print net.params['conv3'][0].data.shape
(1, 64, 8, 8)
(64, 32, 5, 5)

In [16]:
# show the output data after the third pooling layer
show_data(net.blobs['pool3'].data[0], padval=0.2)
print net.blobs['pool3'].data.shape
(1, 64, 4, 4)

In [17]:
# probabilities from the last layer: how likely the input belongs to each class
feat = net.blobs['prob'].data[0]
print feat
plt.plot(feat.flat)
[  5.21440245e-03   1.58397834e-05   3.71246301e-02   2.28459597e-01
   1.08315737e-03   7.17785358e-01   1.91939052e-03   7.67927198e-03
   6.13298907e-04   1.05107691e-04]
Out[17]: [<matplotlib.lines.Line2D at 0x7f3d882b00d0>]

從輸入的結(jié)果和圖示來看,最大的概率是7.17785358e-01,屬于第5類(標(biāo)號從0開始)。與cifar10中的10種類型名稱進(jìn)行對比:

airplane, automobile, bird, cat, deer, dog, frog, horse, ship, truck

根據(jù)測試結(jié)果,判斷為dog。 測試無誤!
