

Translation: Deep Learning Quiz (Course 1, Week 4)


Introduction

This article is translated from the quiz assignments of the deeplearning.ai Deep Learning Specialization. The quizzes of all five courses will be translated step by step in the near future.

Translator: Huang Haiguang (黃海廣)

This installment covers Course 1, Week 4:

Course 1: Neural Networks and Deep Learning

Week 4 Quiz - Key Concepts on Deep Neural Networks

1. What is the "cache" used for in our implementation of forward propagation and backward propagation?

【 】It is used to cache the intermediate values of the cost function during training.

【★】We use it to pass variables computed during forward propagation to the corresponding backward propagation step. It contains useful values for backward propagation to compute derivatives.

【 】It is used to keep track of the hyperparameters that we are searching over, to speed up computation.

【 】We use it to pass variables computed during backward propagation to the corresponding forward propagation step. It contains useful values for forward propagation to compute activations.

Note: the "cache" records values from the forward propagation units and sends them to the backward propagation units, because they are needed to compute the chain-rule derivatives.
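To make this concrete, here is a minimal sketch of the idea; the function names and the exact contents of the cache are illustrative, not the course's reference implementation:

import numpy as np

def linear_forward(A_prev, W, b):
    # Forward step for one layer: compute Z and stash the inputs in a cache.
    Z = W @ A_prev + b
    cache = (A_prev, W, b)      # saved for use in the backward pass
    return Z, cache

def linear_backward(dZ, cache):
    # Backward step: the cached values are exactly what the chain rule needs.
    A_prev, W, b = cache
    m = A_prev.shape[1]
    dW = (dZ @ A_prev.T) / m
    db = np.sum(dZ, axis=1, keepdims=True) / m
    dA_prev = W.T @ dZ
    return dA_prev, dW, db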

2. Among the following, which ones are "hyperparameters"? (Check all that apply. Only the correct options are listed.)

【★】size of the hidden layers n^[l]

【★】learning rate α

【★】number of iterations

【★】number of layers L in the neural network

Note: You can check this Quora post or this blog post.
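As a hedged illustration of the distinction (the values and names below are chosen arbitrarily for the example), hyperparameters are settings you pick and tune, whereas the parameters W^[l], b^[l] are learned during training:

# Hyperparameters: chosen by the practitioner, then tuned.
hyperparameters = {
    "learning_rate": 0.0075,            # α
    "num_iterations": 2500,
    "layer_dims": [12288, 20, 7, 5, 1], # number of layers and hidden-layer sizes
}

# Parameters such as W1, b1, W2, b2, ... are *not* hyperparameters:
# they are learned by gradient descent during training.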

3. Which of the following statements is true?

【★】The deeper layers of a neural network are typically computing more complex features of the input than the earlier layers.

【 】The earlier layers of a neural network are typically computing more complex features of the input than the deeper layers.

Note: You can check the lecture videos. I think Andrew used a CNN example to explain this.

4. Vectorization allows you to compute forward propagation in an L-layer neural network without an explicit for-loop (or any other explicit iterative loop) over the layers l = 1, 2, …, L. True/False?

【 】True

【★】False

Note: We cannot avoid the for-loop iteration over the computations among the layers.
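The following sketch (illustrative function name, assuming tanh hidden units and a sigmoid output) shows what is meant: each layer's computation is vectorized over all training examples at once, yet an explicit loop over the layers remains:

import numpy as np

def forward_propagation_sketch(X, parameters, L):
    # X has shape (n_x, m): vectorized over all m examples at once.
    A = X
    for l in range(1, L + 1):          # explicit loop over the layers
        W = parameters["W" + str(l)]
        b = parameters["b" + str(l)]
        Z = W @ A + b                  # no loop over examples or units
        A = np.tanh(Z) if l < L else 1.0 / (1.0 + np.exp(-Z))
    return A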

5. Assume we store the values for n^[l] in an array called layer_dims, as follows: layer_dims = [n_x, 4, 3, 2, 1]. So layer 1 has four hidden units, layer 2 has three hidden units, and so on. Which of the following for-loops will allow you to initialize the parameters for the model? (Only the correct option is listed.)

for i in range(1, len(layer_dims)):
    parameter['W' + str(i)] = np.random.randn(layer_dims[i], layer_dims[i - 1]) * 0.01
    parameter['b' + str(i)] = np.random.randn(layer_dims[i], 1) * 0.01
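A brief usage sketch of this loop (n_x = 2 is assumed here purely for illustration):

import numpy as np

layer_dims = [2, 4, 3, 2, 1]      # [n_x, 4, 3, 2, 1] with an assumed n_x = 2
parameter = {}
for i in range(1, len(layer_dims)):
    parameter['W' + str(i)] = np.random.randn(layer_dims[i], layer_dims[i - 1]) * 0.01
    parameter['b' + str(i)] = np.random.randn(layer_dims[i], 1) * 0.01

print(parameter['W1'].shape, parameter['b1'].shape)   # (4, 2) (4, 1)
print(parameter['W2'].shape, parameter['b2'].shape)   # (3, 4) (3, 1)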

6. Consider the following neural network (the figure is omitted here). Which of the following statements are true? (Only the correct option is listed.)

【★】The number of layers L is 4. The number of hidden layers is 3.

Note: The input layer (l = 0) does not count.

As seen in lecture, the number of layers is counted as the number of hidden layers + 1. The input and output layers are not counted as hidden layers.
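In code terms, a sketch using the layer_dims convention from question 5 (n_x = 5 is assumed only for the example):

layer_dims = [5, 4, 3, 2, 1]   # input layer (l = 0) plus 4 weighted layers
L = len(layer_dims) - 1        # the input layer does not count, so L = 4
num_hidden_layers = L - 1      # the output layer is not hidden, so 3 hidden layers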

7. During forward propagation, in the forward function for layer l you need to know what the activation function in that layer is (sigmoid, tanh, ReLU, etc.). During backpropagation, the corresponding backward function also needs to know what the activation function for layer l is, since the gradient depends on it. True/False?

【★】True

【 】False

Note: During backpropagation you need to know which activation function was used in the forward propagation to be able to compute the correct derivative.
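A minimal sketch of why the backward function must be told which activation was used (the helper names are illustrative):

import numpy as np

def relu_backward(dA, Z):
    # ReLU'(Z) is 1 where Z > 0 and 0 elsewhere.
    return dA * (Z > 0)

def sigmoid_backward(dA, Z):
    s = 1.0 / (1.0 + np.exp(-Z))
    return dA * s * (1 - s)

def activation_backward(dA, Z, activation):
    # Using the wrong branch here would silently produce wrong gradients.
    if activation == "relu":
        return relu_backward(dA, Z)
    if activation == "sigmoid":
        return sigmoid_backward(dA, Z)
    raise ValueError("unknown activation: " + activation)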

8. There are certain functions with the following properties:

(i) To compute the function using a shallow network circuit, you will need a large network (where we measure size by the number of logic gates in the network), but (ii) to compute it using a deep network circuit, you need only an exponentially smaller network. True/False?

【★】True

【 】False

Note: See the lectures; exactly the same idea was explained there.

9. Consider the following 2-hidden-layer neural network (the figure is omitted here; the answers below imply 4 input features, a first hidden layer with 4 units, a second hidden layer with 3 units, and one output unit). Which of the following statements are true? (Check all that apply. Only the correct options are listed.)

【★】W^[1] will have shape (4, 4)

【★】b^[1] will have shape (4, 1)

【★】W^[2] will have shape (3, 4)

【★】b^[2] will have shape (3, 1)

【★】b^[3] will have shape (1, 1)

【★】W^[3] will have shape (1, 3)

Note: In general, W^[l] has shape (n^[l], n^[l-1]) and b^[l] has shape (n^[l], 1).
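These shapes can be sanity-checked with the initialization loop from question 5, using layer_dims = [4, 4, 3, 1] (the architecture implied by the answers above); the same check also illustrates the general rule asked about in question 10:

import numpy as np

layer_dims = [4, 4, 3, 1]      # n^[0]=4, n^[1]=4, n^[2]=3, n^[3]=1
params = {}
for l in range(1, len(layer_dims)):
    params['W' + str(l)] = np.random.randn(layer_dims[l], layer_dims[l - 1]) * 0.01
    params['b' + str(l)] = np.random.randn(layer_dims[l], 1) * 0.01

assert params['W1'].shape == (4, 4) and params['b1'].shape == (4, 1)
assert params['W2'].shape == (3, 4) and params['b2'].shape == (3, 1)
assert params['W3'].shape == (1, 3) and params['b3'].shape == (1, 1)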

10. Whereas the previous question used a specific network, in the general case what is the dimension of W^[l], the weight matrix associated with layer l? (Only the correct option is listed.)

【★】W^[l] has shape (n^[l], n^[l-1])

Note: Correspondingly, b^[l] has shape (n^[l], 1).
