6. Deep Learning Exercise: Initialization


This article is excerpted from the programming assignments of Andrew Ng's Deep Learning Specialization; thanks to the course authors.


Course link: https://www.deeplearning.ai/deep-learning-specialization/

Contents

1 - Neural Network model

2 - Zero initialization

3 - Random initialization (master this)

4 - He initialization (understand this)


To get started, run the following cell to load the packages and the planar dataset you will try to classify.

import numpy as np
import matplotlib.pyplot as plt
import sklearn
import sklearn.datasets
from init_utils import sigmoid, relu, compute_loss, forward_propagation, backward_propagation
from init_utils import update_parameters, predict, load_dataset, plot_decision_boundary, predict_dec

%matplotlib inline
plt.rcParams['figure.figsize'] = (7.0, 4.0)  # set default size of plots
plt.rcParams['image.interpolation'] = 'nearest'
plt.rcParams['image.cmap'] = 'gray'

# load image dataset: blue/red dots in circles
train_X, train_Y, test_X, test_Y = load_dataset()


1 - Neural Network model

You will use a 3-layer neural network (already implemented for you). Here are the initialization methods you will experiment with:

  • Zeros initialization -- setting initialization = "zeros" in the input argument.
  • Random initialization -- setting initialization = "random" in the input argument. This initializes the weights to large random values.
  • He initialization -- setting initialization = "he" in the input argument. This initializes the weights to random values scaled according to a paper by He et al., 2015.

Instructions: Please quickly read over the code below, and run it. In the next part you will implement the three initialization methods that this model() calls.

def model(X, Y, learning_rate = 0.01, num_iterations = 15000, print_cost = True, initialization = "he"):
    """
    Implements a three-layer neural network: LINEAR->RELU->LINEAR->RELU->LINEAR->SIGMOID.

    Arguments:
    X -- input data, of shape (2, number of examples)
    Y -- true "label" vector (containing 0 for red dots; 1 for blue dots), of shape (1, number of examples)
    learning_rate -- learning rate for gradient descent
    num_iterations -- number of iterations to run gradient descent
    print_cost -- if True, print the cost every 1000 iterations
    initialization -- flag to choose which initialization to use ("zeros", "random" or "he")

    Returns:
    parameters -- parameters learnt by the model
    """
    grads = {}
    costs = []  # to keep track of the loss
    m = X.shape[1]  # number of examples
    layers_dims = [X.shape[0], 10, 5, 1]

    # Initialize parameters dictionary.
    if initialization == "zeros":
        parameters = initialize_parameters_zeros(layers_dims)
    elif initialization == "random":
        parameters = initialize_parameters_random(layers_dims)
    elif initialization == "he":
        parameters = initialize_parameters_he(layers_dims)

    # Loop (gradient descent)
    for i in range(0, num_iterations):
        # Forward propagation: LINEAR -> RELU -> LINEAR -> RELU -> LINEAR -> SIGMOID.
        a3, cache = forward_propagation(X, parameters)

        # Loss
        cost = compute_loss(a3, Y)

        # Backward propagation.
        grads = backward_propagation(X, Y, cache)

        # Update parameters.
        parameters = update_parameters(parameters, grads, learning_rate)

        # Print and record the loss every 1000 iterations
        if print_cost and i % 1000 == 0:
            print("Cost after iteration {}: {}".format(i, cost))
            costs.append(cost)

    # plot the loss
    plt.plot(costs)
    plt.ylabel('cost')
    plt.xlabel('iterations (per thousands)')
    plt.title("Learning rate =" + str(learning_rate))
    plt.show()

    return parameters

2 - Zero initialization

There are two types of parameters to initialize in a neural network:

  • the weight matrices W^[1], W^[2], ..., W^[L]
  • the bias vectors b^[1], b^[2], ..., b^[L]

Exercise: Implement the following function to initialize all parameters to zeros. You'll see later that this does not work well, since it fails to "break symmetry", but let's try it anyway and see what happens. Use np.zeros((..,..)) with the correct shapes.

def initialize_parameters_zeros(layers_dims):
    """
    Arguments:
    layer_dims -- python array (list) containing the size of each layer.

    Returns:
    parameters -- python dictionary containing your parameters "W1", "b1", ..., "WL", "bL":
                    W1 -- weight matrix of shape (layers_dims[1], layers_dims[0])
                    b1 -- bias vector of shape (layers_dims[1], 1)
                    ...
                    WL -- weight matrix of shape (layers_dims[L], layers_dims[L-1])
                    bL -- bias vector of shape (layers_dims[L], 1)
    """
    parameters = {}
    L = len(layers_dims)  # number of layers in the network

    for l in range(1, L):
        parameters['W' + str(l)] = np.zeros((layers_dims[l], layers_dims[l-1]))
        parameters['b' + str(l)] = np.zeros((layers_dims[l], 1))

    return parameters

parameters = initialize_parameters_zeros([3,2,1])
print("W1 = " + str(parameters["W1"]))
print("b1 = " + str(parameters["b1"]))
print("W2 = " + str(parameters["W2"]))
print("b2 = " + str(parameters["b2"]))

W1 = [[0. 0. 0.]
 [0. 0. 0.]]
b1 = [[0.]
 [0.]]
W2 = [[0. 0.]]
b2 = [[0.]]

parameters = model(train_X, train_Y, initialization = "zeros")
print ("On the train set:")
predictions_train = predict(train_X, train_Y, parameters)
print ("On the test set:")
predictions_test = predict(test_X, test_Y, parameters)

plt.title("Model with Zeros initialization")
axes = plt.gca()
axes.set_xlim([-1.5,1.5])
axes.set_ylim([-1.5,1.5])
plot_decision_boundary(lambda x: predict_dec(parameters, x.T), train_X, np.squeeze(train_Y))

The model is predicting 0 for every example.
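With every parameter at zero, the network outputs sigmoid(0) = 0.5 for every example, so the 0.5 decision threshold maps everything to class 0 and the cross-entropy cost is stuck at -log(0.5) from the first iteration on. A one-line check of that number (a sketch, not part of the assignment code):

import numpy as np

# Binary cross-entropy when the model outputs 0.5 for every example,
# regardless of the true label:
print(-np.log(0.5))  # ~0.693, the cost the zero-initialized model is stuck at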

In general, initializing all the weights to zero results in the network failing to break symmetry. This means that every neuron in each layer will learn the same thing, so you might as well be training a neural network with n^[l] = 1 for every layer, and the network is no more powerful than a linear classifier such as logistic regression.
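To make the symmetry argument concrete, here is a minimal numpy sketch (independent of the assignment's init_utils helpers) showing that two zero-initialized neurons compute the same activation and, given the same upstream gradient, receive identical weight updates:

import numpy as np

W1 = np.zeros((2, 2))         # two hidden neurons, two inputs, all weights zero
b1 = np.zeros((2, 1))
x = np.array([[1.0], [2.0]])  # a single training example

z1 = W1 @ x + b1              # both neurons get the same pre-activation: 0
a1 = np.maximum(0, z1)        # and the same ReLU activation: 0

# Because the activations are identical, the upstream gradients dz1 for the
# two neurons are identical too, so each row of dW1 = dz1 @ x.T is the same:
dz1 = np.ones((2, 1))         # stand-in for an identical upstream gradient
dW1 = dz1 @ x.T
print(dW1)                    # [[1. 2.]
                              #  [1. 2.]] -- both neurons updated identically, forever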

**What you should remember**:

  • The weights W^[l] should be initialized randomly to break symmetry.
  • It is however okay to initialize the biases b^[l] to zeros. Symmetry is still broken as long as W^[l] is initialized randomly.


3 - Random initialization (master this)

To break symmetry, let's initialize the weights randomly. Following random initialization, each neuron can then proceed to learn a different function of its inputs. In this exercise, you will see what happens if the weights are initialized randomly, but to very large values.

Exercise: Implement the following function to initialize your weights to large random values (scaled by *10) and your biases to zeros. Use np.random.randn(..,..) * 10 for weights and np.zeros((.., ..)) for biases. We are using a fixed np.random.seed(..) to make sure your "random" weights match ours, so don't worry if running your code several times always gives you the same initial values for the parameters.

def initialize_parameters_random(layers_dims):
    """
    Arguments:
    layer_dims -- python array (list) containing the size of each layer.

    Returns:
    parameters -- python dictionary containing your parameters "W1", "b1", ..., "WL", "bL":
                    W1 -- weight matrix of shape (layers_dims[1], layers_dims[0])
                    b1 -- bias vector of shape (layers_dims[1], 1)
                    ...
                    WL -- weight matrix of shape (layers_dims[L], layers_dims[L-1])
                    bL -- bias vector of shape (layers_dims[L], 1)
    """
    np.random.seed(3)  # This seed makes sure your "random" numbers will be the same as ours
    parameters = {}
    L = len(layers_dims)  # integer representing the number of layers

    for l in range(1, L):
        parameters['W' + str(l)] = np.random.randn(layers_dims[l], layers_dims[l-1]) * 10
        parameters['b' + str(l)] = np.zeros((layers_dims[l], 1))

    return parameters

parameters = model(train_X, train_Y, initialization = "random")
print ("On the train set:")
predictions_train = predict(train_X, train_Y, parameters)
print ("On the test set:")
predictions_test = predict(test_X, test_Y, parameters)

**In summary**:

  • Initializing weights to very large random values does not work well.
  • Hopefully initializing with small random values does better. The important question is: how small should these random values be? Let's find out in the next part!
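To see why the scale matters, the sketch below (illustrative, not assignment code) feeds the same random input through a large-scale and a small-scale layer: the *10 weights push the sigmoid into its flat, saturated region where gradients are nearly zero, while small weights keep it in the responsive region around 0.5:

import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

np.random.seed(0)
x = np.random.randn(10, 1)               # 10 input features, one example

W_large = np.random.randn(5, 10) * 10    # large-scale initialization
W_small = np.random.randn(5, 10) * 0.01  # small-scale initialization

print(np.round(sigmoid(W_large @ x).ravel(), 3))  # values pinned near 0.0 or 1.0 (saturated)
print(np.round(sigmoid(W_small @ x).ravel(), 3))  # values near 0.5 (gradient still useful)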


4 - He initialization (understand this)

Finally, try "He initialization"; this is named for the first author of He et al., 2015. (If you have heard of "Xavier initialization", this is similar, except Xavier initialization uses a scaling factor of sqrt(1./layers_dims[l-1]) for the weights W^[l], where He initialization would use sqrt(2./layers_dims[l-1]).)
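The two schemes differ only in their scaling factor, as the following side-by-side sketch shows (init_layer is an illustrative helper, not part of the assignment):

import numpy as np

def init_layer(n_out, n_in, scheme="he"):
    # Draw one weight matrix with the chosen scaling factor.
    if scheme == "xavier":
        scale = np.sqrt(1.0 / n_in)  # Xavier: sqrt(1/n_in)
    else:
        scale = np.sqrt(2.0 / n_in)  # He: sqrt(2/n_in), suited to ReLU layers
    return np.random.randn(n_out, n_in) * scale

np.random.seed(3)
print(init_layer(100, 400, "xavier").std())  # ~0.05  = sqrt(1/400)
print(init_layer(100, 400, "he").std())      # ~0.071 = sqrt(2/400)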

Exercise: Implement the following function to initialize your parameters with He initialization.

Hint: This function is similar to the previous initialize_parameters_random(...). The only difference is that instead of multiplying np.random.randn(..,..) by 10, you will multiply it by sqrt(2./layers_dims[l-1]), which is what He initialization recommends for layers with a ReLU activation.

# GRADED FUNCTION: initialize_parameters_he

def initialize_parameters_he(layers_dims):
    """
    Arguments:
    layer_dims -- python array (list) containing the size of each layer.

    Returns:
    parameters -- python dictionary containing your parameters "W1", "b1", ..., "WL", "bL":
                    W1 -- weight matrix of shape (layers_dims[1], layers_dims[0])
                    b1 -- bias vector of shape (layers_dims[1], 1)
                    ...
                    WL -- weight matrix of shape (layers_dims[L], layers_dims[L-1])
                    bL -- bias vector of shape (layers_dims[L], 1)
    """
    np.random.seed(3)
    parameters = {}
    L = len(layers_dims) - 1  # integer representing the number of layers

    for l in range(1, L + 1):
        ### START CODE HERE ### (≈ 2 lines of code)
        parameters['W' + str(l)] = np.random.randn(layers_dims[l], layers_dims[l-1]) * np.sqrt(2 / layers_dims[l-1])
        parameters['b' + str(l)] = np.zeros((layers_dims[l], 1))
        ### END CODE HERE ###

    return parameters
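A quick sanity check on why the sqrt(2/n) factor is the right one for ReLU: pushing random inputs through a stack of He-initialized ReLU layers keeps the activation magnitude roughly constant from layer to layer (a sketch with arbitrary layer sizes, not assignment code):

import numpy as np

np.random.seed(1)
dims = [100, 100, 100, 100, 100]    # arbitrary sizes for the check
a = np.random.randn(dims[0], 1000)  # 1000 random examples

for l in range(1, len(dims)):
    W = np.random.randn(dims[l], dims[l-1]) * np.sqrt(2 / dims[l-1])
    a = np.maximum(0, W @ a)        # LINEAR -> RELU
    rms = np.sqrt((a ** 2).mean())  # root-mean-square activation
    print("layer", l, "RMS:", round(float(rms), 3))
# RMS stays near 1.0 at every layer; with a *10 or *0.01 scale it would
# explode or vanish exponentially with depth instead.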

