




Andrew Ng Assignment 8: Recognizing Hand-Gesture Digits with a Three-Layer Neural Network (TensorFlow)

Published: 2024/7/23 · Author: 豆豆

Dataset loading and random mini-batch generation live in tf_utils.py; the code is as follows:

import h5py
import numpy as np
import tensorflow as tf
import math

def load_dataset():
    train_dataset = h5py.File('datasets/train_signs.h5', "r")
    train_set_x_orig = np.array(train_dataset["train_set_x"][:])  # your train set features
    train_set_y_orig = np.array(train_dataset["train_set_y"][:])  # your train set labels

    test_dataset = h5py.File('datasets/test_signs.h5', "r")
    test_set_x_orig = np.array(test_dataset["test_set_x"][:])  # your test set features
    test_set_y_orig = np.array(test_dataset["test_set_y"][:])  # your test set labels

    classes = np.array(test_dataset["list_classes"][:])  # the list of classes

    train_set_y_orig = train_set_y_orig.reshape((1, train_set_y_orig.shape[0]))
    test_set_y_orig = test_set_y_orig.reshape((1, test_set_y_orig.shape[0]))

    return train_set_x_orig, train_set_y_orig, test_set_x_orig, test_set_y_orig, classes

def random_mini_batches(X, Y, mini_batch_size, seed=0):
    """
    Creates a list of random minibatches from (X, Y)

    Arguments:
    X -- input data, of shape (input size, number of examples)
    Y -- true "label" vector, of shape (1, number of examples)
    mini_batch_size -- size of the mini-batches, integer
    seed -- only for grading purposes, so that your "random" minibatches are the same as ours

    Returns:
    mini_batches -- list of synchronous (mini_batch_X, mini_batch_Y)
    """
    m = X.shape[1]  # number of training examples
    mini_batches = []
    np.random.seed(seed)

    # Step 1: Shuffle (X, Y)
    permutation = list(np.random.permutation(m))
    shuffled_X = X[:, permutation]
    shuffled_Y = Y[:, permutation]

    # Step 2: Partition (shuffled_X, shuffled_Y). Minus the end case.
    num_complete_minibatches = math.floor(m / mini_batch_size)  # number of full mini-batches
    for k in range(0, num_complete_minibatches):
        mini_batch_X = shuffled_X[:, k * mini_batch_size : (k + 1) * mini_batch_size]
        mini_batch_Y = shuffled_Y[:, k * mini_batch_size : (k + 1) * mini_batch_size]
        mini_batches.append((mini_batch_X, mini_batch_Y))

    # Handling the end case (last mini-batch < mini_batch_size)
    if m % mini_batch_size != 0:
        mini_batch_X = shuffled_X[:, num_complete_minibatches * mini_batch_size : m]
        mini_batch_Y = shuffled_Y[:, num_complete_minibatches * mini_batch_size : m]
        mini_batches.append((mini_batch_X, mini_batch_Y))

    return mini_batches

def convert_to_one_hot(Y, C):
    # Y.reshape(-1) flattens Y into one row
    Y = np.eye(C)[Y.reshape(-1)].T
    return Y

def predict(X, parameters):
    W1 = tf.convert_to_tensor(parameters["W1"])
    b1 = tf.convert_to_tensor(parameters["b1"])
    W2 = tf.convert_to_tensor(parameters["W2"])
    b2 = tf.convert_to_tensor(parameters["b2"])
    W3 = tf.convert_to_tensor(parameters["W3"])
    b3 = tf.convert_to_tensor(parameters["b3"])

    params = {"W1": W1, "b1": b1, "W2": W2, "b2": b2, "W3": W3, "b3": b3}

    x = tf.placeholder("float", [12288, 1])
    z3 = forward_propagation_for_predict(x, params)
    p = tf.argmax(z3)

    sess = tf.Session()
    prediction = sess.run(p, feed_dict={x: X})
    return prediction

def forward_propagation_for_predict(X, parameters):
    """
    Implements the forward propagation for the model:
    LINEAR -> RELU -> LINEAR -> RELU -> LINEAR -> SOFTMAX

    Arguments:
    X -- input dataset placeholder, of shape (input size, number of examples)
    parameters -- python dictionary containing "W1", "b1", "W2", "b2", "W3", "b3",
                  with the shapes given in initialize_parameters

    Returns:
    Z3 -- the output of the last LINEAR unit
    """
    W1 = parameters['W1']
    b1 = parameters['b1']
    W2 = parameters['W2']
    b2 = parameters['b2']
    W3 = parameters['W3']
    b3 = parameters['b3']

    Z1 = tf.add(tf.matmul(W1, X), b1)   # Z1 = np.dot(W1, X) + b1
    A1 = tf.nn.relu(Z1)                 # A1 = relu(Z1)
    Z2 = tf.add(tf.matmul(W2, A1), b2)  # Z2 = np.dot(W2, A1) + b2
    A2 = tf.nn.relu(Z2)                 # A2 = relu(Z2)
    Z3 = tf.add(tf.matmul(W3, A2), b3)  # Z3 = np.dot(W3, A2) + b3
    return Z3
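The partitioning arithmetic in random_mini_batches (⌊m / mini_batch_size⌋ full batches plus one smaller remainder batch) can be sanity-checked with plain Python, independent of TensorFlow and the course data. A minimal sketch; batch_slices is a hypothetical helper that only reproduces the (start, end) column ranges:

```python
import math

def batch_slices(m, mini_batch_size):
    """Return the (start, end) column ranges the partitioning scheme produces."""
    num_complete = math.floor(m / mini_batch_size)
    slices = [(k * mini_batch_size, (k + 1) * mini_batch_size) for k in range(num_complete)]
    if m % mini_batch_size != 0:  # last, smaller batch
        slices.append((num_complete * mini_batch_size, m))
    return slices

# 1080 training examples with batch size 32: 33 full batches + 1 batch of 24
slices = batch_slices(1080, 32)
print(len(slices))   # 34
print(slices[-1])    # (1056, 1080)
```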

First, inspect the dataset:

import tf_utils
import cv2

train_set_x_orig, train_set_Y, test_set_x_orig, test_set_Y, classes = tf_utils.load_dataset()
print('training samples={}'.format(train_set_x_orig.shape))
print('training labels={}'.format(train_set_Y.shape))
print('test samples={}'.format(test_set_x_orig.shape))
print('test labels={}'.format(test_set_Y.shape))
print('label of sample 5={}'.format(train_set_Y[0, 5]))
cv2.imshow('1.jpg', train_set_x_orig[5, :, :, :])
cv2.waitKey()

Output: there are 1080 training samples of size (64, 64, 3), and each gesture is labeled with the corresponding digit, so the labels must later be one-hot encoded to shape (samples, 6).
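The np.eye(C)[Y.reshape(-1)].T trick used below for one-hot encoding is easiest to see on a toy label vector (a minimal sketch with made-up labels):

```python
import numpy as np

def convert_one_hot(Y, C):
    # Row i of np.eye(C) is the one-hot vector for class i; fancy indexing
    # picks one row per label, and .T gives shape (C, number of examples).
    return np.eye(C)[Y.reshape(-1)].T

Y = np.array([[1, 0, 3]])    # labels of shape (1, 3), as load_dataset returns them
one_hot = convert_one_hot(Y, 6)
print(one_hot.shape)         # (6, 3)
print(one_hot[:, 0])         # [0. 1. 0. 0. 0. 0.]
```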

We use a three-layer neural network with W1=(25, 64*64*3), W2=(12, 25), and W3=(6, 12). The input is X=(64*64*3, samples) and the final y_pred=(6, samples), which is transposed before computing the loss against the given true y. The code is as follows:
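These layer shapes can be verified before touching TensorFlow by pushing a dummy batch through the same matrix products in NumPy. A minimal sketch; random weights stand in for the trained parameters:

```python
import numpy as np

rng = np.random.default_rng(0)
samples = 4
X = rng.random((64 * 64 * 3, samples))        # (12288, samples)
W1, b1 = rng.random((25, 12288)), np.zeros((25, 1))
W2, b2 = rng.random((12, 25)), np.zeros((12, 1))
W3, b3 = rng.random((6, 12)), np.zeros((6, 1))

A1 = np.maximum(W1 @ X + b1, 0)    # ReLU(Z1): (25, samples)
A2 = np.maximum(W2 @ A1 + b2, 0)   # ReLU(Z2): (12, samples)
Z3 = W3 @ A2 + b3                  # logits:   (6, samples)
print(Z3.shape)                    # (6, 4) -> transposed to (samples, 6) for the loss
```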

import tensorflow as tf
import numpy as np
import matplotlib.pyplot as plt
import tf_utils
import cv2

"""
Create placeholders
"""
def create_placeholder(n_x, n_y):
    X = tf.placeholder(tf.float32, shape=[n_x, None], name='X')
    Y = tf.placeholder(tf.float32, shape=[n_y, None], name='Y')
    return X, Y

"""
Initialize the weights
"""
def initialize_parameters():
    tf.set_random_seed(1)
    W1 = tf.get_variable(name='W1', shape=[25, 12288], dtype=tf.float32,
                         initializer=tf.contrib.layers.xavier_initializer(seed=1))
    b1 = tf.get_variable(name='b1', shape=[25, 1], dtype=tf.float32,
                         initializer=tf.zeros_initializer())
    W2 = tf.get_variable(name='W2', shape=[12, 25], dtype=tf.float32,
                         initializer=tf.contrib.layers.xavier_initializer(seed=1))
    b2 = tf.get_variable(name='b2', shape=[12, 1], dtype=tf.float32,
                         initializer=tf.zeros_initializer())
    W3 = tf.get_variable(name='W3', shape=[6, 12], dtype=tf.float32,
                         initializer=tf.contrib.layers.xavier_initializer(seed=1))
    b3 = tf.get_variable(name='b3', shape=[6, 1], dtype=tf.float32,
                         initializer=tf.zeros_initializer())
    parameters = {'W1': W1, 'b1': b1, 'W2': W2, 'b2': b2, 'W3': W3, 'b3': b3}
    return parameters

"""
One-hot encoding
"""
def convert_one_hot(Y, C):
    one_hot = np.eye(C)[Y.reshape(-1)].T
    return one_hot

"""
Forward propagation
"""
def forward_propagation(X, parameters):
    W1 = parameters['W1']
    b1 = parameters['b1']
    W2 = parameters['W2']
    b2 = parameters['b2']
    W3 = parameters['W3']
    b3 = parameters['b3']
    Z1 = tf.add(tf.matmul(W1, X), b1)
    A1 = tf.nn.relu(Z1)
    Z2 = tf.add(tf.matmul(W2, A1), b2)
    A2 = tf.nn.relu(Z2)
    Z3 = tf.add(tf.matmul(W3, A2), b3)
    return Z3

"""
Compute the cost
"""
def compute_cost(Z3, Y):
    # tf.nn.softmax_cross_entropy_with_logits_v2 expects shape (number of examples, num_classes)
    Z_input = tf.transpose(Z3)
    Y = tf.transpose(Y)
    cost = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits_v2(logits=Z_input, labels=Y))
    return cost

"""
Build the model
"""
def model(train_X, train_Y, test_X, test_Y, learning_rate, num_epochs, minibatch_size):
    tf.set_random_seed(1)
    seed = 3
    (n_x, m) = train_X.shape       # (12288, 1080)
    costs = []
    n_y = train_Y.shape[0]         # train_Y is (6, 1080)
    X, Y = create_placeholder(n_x, n_y)
    parameters = initialize_parameters()
    Z3 = forward_propagation(X, parameters)
    cost = compute_cost(Z3, Y)
    optimizer = tf.train.AdamOptimizer(learning_rate=learning_rate).minimize(cost)
    init = tf.global_variables_initializer()
    sess = tf.Session()
    sess.run(init)
    for i in range(num_epochs):
        epoch_cost = 0
        mini_batches = tf_utils.random_mini_batches(train_X, train_Y, minibatch_size, seed)
        num_minibatches = int(m / minibatch_size)
        for mini_batch in mini_batches:
            (mini_batch_X, mini_batch_Y) = mini_batch
            _, temp_cost = sess.run([optimizer, cost], feed_dict={X: mini_batch_X, Y: mini_batch_Y})
            epoch_cost += temp_cost / num_minibatches
        if i % 100 == 0:
            print('after {} iterations minibatch_cost={}'.format(i, epoch_cost))
            costs.append(epoch_cost)
    plt.plot(costs)
    plt.xlabel('iterations')
    plt.ylabel('cost')
    plt.title('learning_rate={}'.format(learning_rate))
    plt.show()
    parameters = sess.run(parameters)
    # axis 0: argmax down each column (per example); axis 1 would be per row
    correct_prediction = tf.equal(tf.argmax(Z3, 0), tf.argmax(Y, 0))
    accuracy = tf.reduce_mean(tf.cast(correct_prediction, 'float'))
    print('train accuracy is', sess.run(accuracy, feed_dict={X: train_X, Y: train_Y}))
    print('test accuracy is', sess.run(accuracy, feed_dict={X: test_X, Y: test_Y}))
    return parameters

"""
Test the model
"""
def test_model():
    train_set_x_orig, train_set_Y, test_set_x_orig, test_set_Y, classes = tf_utils.load_dataset()
    train_set_x_flatten = train_set_x_orig.reshape(
        train_set_x_orig.shape[0],
        train_set_x_orig.shape[1] * train_set_x_orig.shape[2] * 3).T
    test_set_x_flatten = test_set_x_orig.reshape(
        test_set_x_orig.shape[0],
        test_set_x_orig.shape[1] * test_set_x_orig.shape[2] * 3).T
    train_X = train_set_x_flatten / 255          # (12288, 1080)
    test_X = test_set_x_flatten / 255
    train_Y = convert_one_hot(train_set_Y, 6)    # (6, 1080)
    test_Y = convert_one_hot(test_set_Y, 6)
    parameters = model(train_X, train_Y, test_X, test_Y,
                       learning_rate=0.0001, num_epochs=1000, minibatch_size=32)
    img = cv2.imread('thumbs_up.jpg')
    imgsize = cv2.resize(img, (64, 64), interpolation=cv2.INTER_CUBIC).reshape(1, 64 * 64 * 3).T
    cv2.imshow('imgsize', imgsize)
    image_predict = tf_utils.predict(imgsize, parameters)
    print(image_predict)

if __name__ == '__main__':
    test_model()
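The transpose in compute_cost exists because softmax_cross_entropy_with_logits_v2 wants logits and one-hot labels of shape (examples, classes). The loss it computes can be reproduced in plain NumPy; a sketch with toy logits (softmax_xent is a hypothetical helper, not part of the assignment code):

```python
import numpy as np

def softmax_xent(logits, labels):
    # logits, labels: shape (num_examples, num_classes), labels one-hot
    shifted = logits - logits.max(axis=1, keepdims=True)  # subtract max for numerical stability
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    return -(labels * log_probs).sum(axis=1).mean()       # mean cross-entropy over examples

logits = np.array([[2.0, 1.0, 0.1],
                   [0.5, 2.5, 0.0]])
labels = np.array([[1.0, 0.0, 0.0],
                   [0.0, 1.0, 0.0]])
print(softmax_xent(logits, labels))
```

A very confident correct prediction (e.g. a logit of 10 on the true class) drives this loss close to zero, which is what the Adam steps above are pushing toward.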

Output:

The prediction for the image below is 1, which matches the gesture.


Summary

That is the complete walkthrough of Andrew Ng Assignment 8, recognizing hand-gesture digits with a three-layer neural network in TensorFlow; hopefully it helps you solve similar problems.

