
Assignment | 05-week1 -Character level language model - Dinosaurus land


This series only adds my personal study notes to the homework portion of the original course. If there are any mistakes, corrections are welcome. - ZJ

Coursera course | deeplearning.ai | 網(wǎng)易云課堂 (NetEase Cloud Classroom)

CSDN:http://blog.csdn.net/JUNJUN_ZHAO/article/details/79409325


Welcome to Dinosaurus Island! 65 million years ago, dinosaurs existed, and in this assignment they are back. You are in charge of a special task. Leading biology researchers are creating new breeds of dinosaurs and bringing them to life on earth, and your job is to give names to these dinosaurs. If a dinosaur does not like its name, it might go berserk, so choose wisely!


Luckily you have learned some deep learning and you will use it to save the day. Your assistant has collected a list of all the dinosaur names they could find, and compiled them into this dataset. (Feel free to take a look by clicking the previous link.) To create new dinosaur names, you will build a character level language model to generate new names. Your algorithm will learn the different name patterns, and randomly generate new names. Hopefully this algorithm will keep you and your team safe from the dinosaurs’ wrath!


By completing this assignment you will learn:

  • How to store text data for processing using an RNN
  • How to synthesize data, by sampling predictions at each time step and passing them to the next RNN-cell unit
  • How to build a character-level text generation recurrent neural network
  • Why clipping the gradients is important (it prevents exploding gradients)

We will begin by loading in some functions that we have provided for you in rnn_utils. Specifically, you have access to functions such as rnn_forward and rnn_backward which are equivalent to those you’ve implemented in the previous assignment.


import numpy as np
from utils import *
import random

# Code provided in utils.py:

import numpy as np

def softmax(x):
    e_x = np.exp(x - np.max(x))
    return e_x / e_x.sum(axis=0)

def smooth(loss, cur_loss):
    return loss * 0.999 + cur_loss * 0.001

def print_sample(sample_ix, ix_to_char):
    txt = ''.join(ix_to_char[ix] for ix in sample_ix)
    txt = txt[0].upper() + txt[1:]  # capitalize first character
    print('%s' % (txt, ), end='')

def get_initial_loss(vocab_size, seq_length):
    return -np.log(1.0/vocab_size)*seq_length

def initialize_parameters(n_a, n_x, n_y):
    """
    Initialize parameters with small random values

    Returns:
    parameters -- python dictionary containing:
                    Wax -- Weight matrix multiplying the input, numpy array of shape (n_a, n_x)
                    Waa -- Weight matrix multiplying the hidden state, numpy array of shape (n_a, n_a)
                    Wya -- Weight matrix relating the hidden-state to the output, numpy array of shape (n_y, n_a)
                    b -- Bias, numpy array of shape (n_a, 1)
                    by -- Bias relating the hidden-state to the output, numpy array of shape (n_y, 1)
    """
    np.random.seed(1)
    Wax = np.random.randn(n_a, n_x)*0.01  # input to hidden
    Waa = np.random.randn(n_a, n_a)*0.01  # hidden to hidden
    Wya = np.random.randn(n_y, n_a)*0.01  # hidden to output
    b = np.zeros((n_a, 1))   # hidden bias
    by = np.zeros((n_y, 1))  # output bias
    parameters = {"Wax": Wax, "Waa": Waa, "Wya": Wya, "b": b, "by": by}
    return parameters

def rnn_step_forward(parameters, a_prev, x):
    Waa, Wax, Wya, by, b = parameters['Waa'], parameters['Wax'], parameters['Wya'], parameters['by'], parameters['b']
    a_next = np.tanh(np.dot(Wax, x) + np.dot(Waa, a_prev) + b)  # hidden state
    p_t = softmax(np.dot(Wya, a_next) + by)  # probabilities for next chars
    return a_next, p_t

def rnn_step_backward(dy, gradients, parameters, x, a, a_prev):
    gradients['dWya'] += np.dot(dy, a.T)
    gradients['dby'] += dy
    da = np.dot(parameters['Wya'].T, dy) + gradients['da_next']  # backprop into h
    daraw = (1 - a * a) * da  # backprop through tanh nonlinearity
    gradients['db'] += daraw
    gradients['dWax'] += np.dot(daraw, x.T)
    gradients['dWaa'] += np.dot(daraw, a_prev.T)
    gradients['da_next'] = np.dot(parameters['Waa'].T, daraw)
    return gradients

def update_parameters(parameters, gradients, lr):
    parameters['Wax'] += -lr * gradients['dWax']
    parameters['Waa'] += -lr * gradients['dWaa']
    parameters['Wya'] += -lr * gradients['dWya']
    parameters['b'] += -lr * gradients['db']
    parameters['by'] += -lr * gradients['dby']
    return parameters

def rnn_forward(X, Y, a0, parameters, vocab_size=27):
    # Initialize x, a and y_hat as empty dictionaries
    x, a, y_hat = {}, {}, {}
    a[-1] = np.copy(a0)
    # initialize your loss to 0
    loss = 0
    for t in range(len(X)):
        # Set x[t] to be the one-hot vector representation of the t'th character in X.
        # If X[t] == None, we just have x[t]=0. This is used to set the input for the first timestep to the zero vector.
        x[t] = np.zeros((vocab_size, 1))
        if (X[t] != None):
            x[t][X[t]] = 1
        # Run one step forward of the RNN
        a[t], y_hat[t] = rnn_step_forward(parameters, a[t-1], x[t])
        # Update the loss by subtracting the cross-entropy term of this time-step from it.
        loss -= np.log(y_hat[t][Y[t], 0])
    cache = (y_hat, a, x)
    return loss, cache

def rnn_backward(X, Y, parameters, cache):
    # Initialize gradients as an empty dictionary
    gradients = {}
    # Retrieve from cache and parameters
    (y_hat, a, x) = cache
    Waa, Wax, Wya, by, b = parameters['Waa'], parameters['Wax'], parameters['Wya'], parameters['by'], parameters['b']
    # Each one should be initialized to zeros of the same dimension as its corresponding parameter
    gradients['dWax'], gradients['dWaa'], gradients['dWya'] = np.zeros_like(Wax), np.zeros_like(Waa), np.zeros_like(Wya)
    gradients['db'], gradients['dby'] = np.zeros_like(b), np.zeros_like(by)
    gradients['da_next'] = np.zeros_like(a[0])
    ### START CODE HERE ###
    # Backpropagate through time
    for t in reversed(range(len(X))):
        dy = np.copy(y_hat[t])
        dy[Y[t]] -= 1
        gradients = rnn_step_backward(dy, gradients, parameters, x[t], a[t], a[t-1])
    ### END CODE HERE ###
    return gradients, a

1 - Problem Statement

1.1 - Dataset and Preprocessing

Run the following cell to read the dataset of dinosaur names, create a list of unique characters (such as a-z), and compute the dataset and vocabulary size.


data = open('dinos.txt', 'r').read()
data = data.lower()       # convert everything to lowercase
chars = list(set(data))   # set() removes duplicate characters; convert back to a list
data_size, vocab_size = len(data), len(chars)
print('There are %d total characters and %d unique characters in your data.' % (data_size, vocab_size))

There are 19909 total characters and 27 unique characters in your data.

The characters are a-z (26 characters) plus the "\n" (or newline character), which in this assignment plays a role similar to the <EOS> (or "End of sentence") token we had discussed in lecture, only here it indicates the end of the dinosaur name rather than the end of a sentence. In the cell below, we create a python dictionary (i.e., a hash table) to map each character to an index from 0-26. We also create a second python dictionary that maps each index back to the corresponding character. This will help you figure out what index corresponds to what character in the probability distribution output of the softmax layer. Below, char_to_ix and ix_to_char are the python dictionaries.


char_to_ix = { ch:i for i,ch in enumerate(sorted(chars)) }
ix_to_char = { i:ch for i,ch in enumerate(sorted(chars)) }
print(ix_to_char)
print(char_to_ix)

{0: '\n', 1: 'a', 2: 'b', 3: 'c', 4: 'd', 5: 'e', 6: 'f', 7: 'g', 8: 'h', 9: 'i', 10: 'j', 11: 'k', 12: 'l', 13: 'm', 14: 'n', 15: 'o', 16: 'p', 17: 'q', 18: 'r', 19: 's', 20: 't', 21: 'u', 22: 'v', 23: 'w', 24: 'x', 25: 'y', 26: 'z'}
{'\n': 0, 'a': 1, 'b': 2, 'c': 3, 'd': 4, 'e': 5, 'f': 6, 'g': 7, 'h': 8, 'i': 9, 'j': 10, 'k': 11, 'l': 12, 'm': 13, 'n': 14, 'o': 15, 'p': 16, 'q': 17, 'r': 18, 's': 19, 't': 20, 'u': 21, 'v': 22, 'w': 23, 'x': 24, 'y': 25, 'z': 26}

1.2 - Overview of the model

Your model will have the following structure:

  • Initialize parameters
  • Run the optimization loop
    • Forward propagation to compute the loss function
    • Backward propagation to compute the gradients with respect to the loss function
    • Clip the gradients to avoid exploding gradients
    • Using the gradients, update your parameters with the gradient descent update rule.
  • Return the learned parameters



Figure 1: Recurrent Neural Network, similar to what you had built in the previous notebook “Building a RNN - Step by Step”.

At each time-step, the RNN tries to predict what is the next character given the previous characters. The dataset $X = (x^{\langle 1 \rangle}, x^{\langle 2 \rangle}, ..., x^{\langle T_x \rangle})$ is a list of characters in the training set, while $Y = (y^{\langle 1 \rangle}, y^{\langle 2 \rangle}, ..., y^{\langle T_x \rangle})$ is such that at every time-step $t$, we have $y^{\langle t \rangle} = x^{\langle t+1 \rangle}$.
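As a quick illustration of this one-step shift (the name "trex" below is made up, not taken from the dataset):

# A minimal sketch of the X / Y relationship for one hypothetical example.
# The trailing "\n" marks the end of the name.
name = "trex\n"
X = list(name[:-1])   # ['t', 'r', 'e', 'x']
Y = list(name[1:])    # ['r', 'e', 'x', '\n']
assert all(Y[t] == X[t + 1] for t in range(len(X) - 1))  # y<t> = x<t+1>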

2 - Building blocks of the model

In this part, you will build two important blocks of the overall model:
- Gradient clipping: to avoid exploding gradients
- Sampling: a technique used to generate characters

You will then apply these two functions to build the model.

2.1 - Clipping the gradients in the optimization loop

In this section you will implement the clip function that you will call inside of your optimization loop. Recall that your overall loop structure usually consists of a forward pass, a cost computation, a backward pass, and a parameter update. Before updating the parameters, you will perform gradient clipping when needed to make sure that your gradients are not “exploding,” meaning taking on overly large values.


In the exercise below, you will implement a function clip that takes in a dictionary of gradients and returns a clipped version of gradients if needed. There are different ways to clip gradients; we will use a simple element-wise clipping procedure, in which every element of the gradient vector is clipped to lie between some range [-N, N]. More generally, you will provide a maxValue (say 10). In this example, if any component of the gradient vector is greater than 10, it would be set to 10; and if any component of the gradient vector is less than -10, it would be set to -10. If it is between -10 and 10, it is left alone.

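As a toy illustration of this element-wise rule with a maxValue of 10 (this is not the graded function, just np.clip applied to a made-up vector):

import numpy as np

grad = np.array([-12.3, -10.0, -3.7, 0.0, 8.2, 42.0])  # hypothetical gradient values
clipped = np.clip(grad, -10, 10)  # values outside [-10, 10] snap to the boundary
print(clipped)  # every entry now lies in [-10, 10]; in-range values are untouched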


Figure 2: Visualization of gradient descent with and without gradient clipping, in a case where the network is running into slight “exploding gradient” problems.

Exercise: Implement the function below to return the clipped gradients of your dictionary gradients. Your function takes in a maximum threshold and returns the clipped versions of your gradients. You can check out this hint for examples of how to clip in numpy. You will need to use the argument out = ....

### GRADED FUNCTION: clip

def clip(gradients, maxValue):
    '''
    Clips the gradients' values between minimum and maximum.

    Arguments:
    gradients -- a dictionary containing the gradients "dWaa", "dWax", "dWya", "db", "dby"
    maxValue -- everything above this number is set to this number, and everything less than -maxValue is set to -maxValue

    Returns:
    gradients -- a dictionary with the clipped gradients.
    '''
    ### START CODE HERE ###
    # clip to mitigate exploding gradients, loop over [dWax, dWaa, dWya, db, dby]. (≈2 lines)
    for name, val in gradients.items():
        gradients[name] = np.clip(val, -maxValue, maxValue, out=gradients[name])
    ### END CODE HERE ###

    return gradients

So the gradient clipping used here simply sets a maximum/minimum range and removes whatever falls outside it.

np.random.seed(3)
dWax = np.random.randn(5, 3) * 10
dWaa = np.random.randn(5, 5) * 10
dWya = np.random.randn(2, 5) * 10
db = np.random.randn(5, 1) * 10
dby = np.random.randn(2, 1) * 10
gradients = {"dWax": dWax, "dWaa": dWaa, "dWya": dWya, "db": db, "dby": dby}
gradients = clip(gradients, 10)
print("gradients[\"dWaa\"][1][2] =", gradients["dWaa"][1][2])
print("gradients[\"dWax\"][3][1] =", gradients["dWax"][3][1])
print("gradients[\"dWya\"][1][2] =", gradients["dWya"][1][2])
print("gradients[\"db\"][4] =", gradients["db"][4])
print("gradients[\"dby\"][1] =", gradients["dby"][1])

gradients["dWaa"][1][2] = 10.0
gradients["dWax"][3][1] = -10.0
gradients["dWya"][1][2] = 0.2971381536101662
gradients["db"][4] = [10.]
gradients["dby"][1] = [8.45833407]

*Expected output:*

**gradients["dWaa"][1][2]** 10.0
**gradients["dWax"][3][1]** -10.0
**gradients["dWya"][1][2]** 0.29713815361
**gradients["db"][4]** [ 10.]
**gradients["dby"][1]** [ 8.45833407]

2.2 - Sampling

Now assume that your model is trained. You would like to generate new text (characters). The process of generation is explained in the picture below:


Figure 3: In this picture, we assume the model is already trained. We pass in $x^{\langle 1 \rangle} = \vec{0}$ at the first time step, and have the network then sample one character at a time.

Exercise: Implement the sample function below to sample characters. You need to carry out 4 steps:

  • Step 1: Pass the network the first "dummy" input $x^{\langle 1 \rangle} = \vec{0}$ (the vector of zeros). This is the default input before we've generated any characters. We also set $a^{\langle 0 \rangle} = \vec{0}$.

  • Step 2: Run one step of forward propagation to get $a^{\langle 1 \rangle}$ and $\hat{y}^{\langle 1 \rangle}$. Here are the equations:

$$a^{\langle t+1 \rangle} = \tanh(W_{ax} x^{\langle t \rangle} + W_{aa} a^{\langle t \rangle} + b) \tag{1}$$

$$z^{\langle t+1 \rangle} = W_{ya} a^{\langle t+1 \rangle} + b_y \tag{2}$$

$$\hat{y}^{\langle t+1 \rangle} = \mathrm{softmax}(z^{\langle t+1 \rangle}) \tag{3}$$

Note that $\hat{y}^{\langle t+1 \rangle}$ is a (softmax) probability vector (its entries are between 0 and 1 and sum to 1). $\hat{y}^{\langle t+1 \rangle}_i$ represents the probability that the character indexed by "i" is the next character. We have provided a softmax() function that you can use.


  • Step 3: Carry out sampling: Pick the next character's index according to the probability distribution specified by $\hat{y}^{\langle t+1 \rangle}$. This means that if $\hat{y}^{\langle t+1 \rangle}_i = 0.16$, you will pick the index "i" with 16% probability. To implement it, you can use np.random.choice.

Here is an example of how to use np.random.choice():

np.random.seed(0)
p = np.array([0.1, 0.0, 0.7, 0.2])
index = np.random.choice([0, 1, 2, 3], p=p.ravel())

This means that you will pick the index according to the distribution:

$P(\text{index}=0) = 0.1, P(\text{index}=1) = 0.0, P(\text{index}=2) = 0.7, P(\text{index}=3) = 0.2$.
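To convince yourself of this, here is a quick, purely illustrative frequency check; over many draws the empirical frequencies should roughly match p:

import numpy as np

np.random.seed(0)
p = np.array([0.1, 0.0, 0.7, 0.2])
draws = np.random.choice(4, size=10000, p=p)      # 10,000 independent samples
print(np.bincount(draws, minlength=4) / 10000)    # roughly [0.1, 0.0, 0.7, 0.2]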

  • Step 4: The last step to implement in sample() is to overwrite the variable x, which currently stores $x^{\langle t \rangle}$, with the value of $x^{\langle t+1 \rangle}$. You will represent $x^{\langle t+1 \rangle}$ by creating a one-hot vector corresponding to the character you've chosen as your prediction. You will then forward propagate $x^{\langle t+1 \rangle}$ in Step 1 and keep repeating the process until you get a "\n" character, indicating you've reached the end of the dinosaur name.


# GRADED FUNCTION: sample

def sample(parameters, char_to_ix, seed):
    """
    Sample a sequence of characters according to a sequence of probability distributions output of the RNN.
    (Sampling here can simply be understood as picking at random according to those distributions.)

    Arguments:
    parameters -- python dictionary containing the parameters Waa, Wax, Wya, by, and b.
    char_to_ix -- python dictionary mapping each character to an index.
    seed -- used for grading purposes. Do not worry about it.

    Returns:
    indices -- a list of length n containing the indices of the sampled characters.
    """
    # Retrieve parameters and relevant shapes from "parameters" dictionary
    Waa, Wax, Wya, by, b = parameters['Waa'], parameters['Wax'], parameters['Wya'], parameters['by'], parameters['b']
    vocab_size = by.shape[0]  # by: (27, 1); per equation (2), predicting at the character level means by has one row per character in the vocabulary
    n_a = Waa.shape[1]

    ### START CODE HERE ###
    # Step 1: Create the one-hot vector x for the first character (initializing the sequence generation). (≈1 line)
    # x is a one-hot vector of shape (27, 1): one entry per character in the vocabulary
    x = np.zeros((vocab_size, 1))
    # Step 1': Initialize a_prev as zeros (≈1 line); remember everything here is character-level, so these are vectors
    a_prev = np.zeros((n_a, 1))

    # Create an empty list of indices, this is the list which will contain the list of indices of the characters to generate (≈1 line)
    indices = []

    # Idx is a flag to detect a newline character, we initialize it to -1
    idx = -1

    # Loop over time-steps t. At each time-step, sample a character from a probability distribution and append
    # its index to "indices". We'll stop if we reach 50 characters (which should be very unlikely with a well
    # trained model), which helps debugging and prevents entering an infinite loop.
    counter = 0
    newline_character = char_to_ix['\n']  # the index of the "\n" character

    while (idx != newline_character and counter != 50):
        # Step 2: Forward propagate x using the equations (1), (2) and (3)
        a = np.tanh(np.matmul(Wax, x) + np.matmul(Waa, a_prev) + b)
        z = np.matmul(Wya, a) + by
        y = softmax(z)

        # for grading purposes
        np.random.seed(counter + seed)

        # Step 3: Sample the index of a character within the vocabulary from the probability distribution y
        idx = np.random.choice(range(vocab_size), p=y.ravel())

        # Append the index to "indices"
        indices.append(idx)

        # Step 4: Overwrite the input character as the one corresponding to the sampled index.
        x = np.zeros((vocab_size, 1))
        x[idx] = 1

        # Update "a_prev" to be "a"
        a_prev = a

        # for grading purposes
        seed += 1
        counter += 1
    ### END CODE HERE ###

    if (counter == 50):
        indices.append(char_to_ix['\n'])

    return indices

Error log: note that my sampled output below differs slightly from the expected output.

np.random.seed(2)
_, n_a = 20, 100
Wax, Waa, Wya = np.random.randn(n_a, vocab_size), np.random.randn(n_a, n_a), np.random.randn(vocab_size, n_a)
b, by = np.random.randn(n_a, 1), np.random.randn(vocab_size, 1)
parameters = {"Wax": Wax, "Waa": Waa, "Wya": Wya, "b": b, "by": by}

indices = sample(parameters, char_to_ix, 0)
print("Sampling:")
print("list of sampled indices:", indices)
print("list of sampled characters:", [ix_to_char[i] for i in indices])

Sampling:
list of sampled indices: [12, 17, 24, 14, 13, 9, 10, 22, 24, 6, 13, 11, 12, 6, 21, 15, 21, 14, 3, 2, 1, 21, 18, 24, 7, 25, 6, 25, 18, 10, 16, 2, 3, 8, 15, 12, 11, 7, 1, 12, 10, 2, 7, 7, 11, 3, 6, 23, 13, 1, 0]
list of sampled characters: ['l', 'q', 'x', 'n', 'm', 'i', 'j', 'v', 'x', 'f', 'm', 'k', 'l', 'f', 'u', 'o', 'u', 'n', 'c', 'b', 'a', 'u', 'r', 'x', 'g', 'y', 'f', 'y', 'r', 'j', 'p', 'b', 'c', 'h', 'o', 'l', 'k', 'g', 'a', 'l', 'j', 'b', 'g', 'g', 'k', 'c', 'f', 'w', 'm', 'a', '\n']

*Expected output:*

**list of sampled indices:** [12, 17, 24, 14, 13, 9, 10, 22, 24, 6, 13, 11, 12, 6, 21, 15, 21, 14, 3, 2, 1, 21, 18, 24, 7, 25, 6, 25, 18, 10, 16, 2, 3, 8, 15, 12, 11, 7, 1, 12, 10, 2, 7, 7, 11, 5, 6, 12, 25, 0, 0]
**list of sampled characters:** ['l', 'q', 'x', 'n', 'm', 'i', 'j', 'v', 'x', 'f', 'm', 'k', 'l', 'f', 'u', 'o', 'u', 'n', 'c', 'b', 'a', 'u', 'r', 'x', 'g', 'y', 'f', 'y', 'r', 'j', 'p', 'b', 'c', 'h', 'o', 'l', 'k', 'g', 'a', 'l', 'j', 'b', 'g', 'g', 'k', 'e', 'f', 'l', 'y', '\n', '\n']

3 - Building the language model

It is time to build the character-level language model for text generation.

3.1 - Gradient descent

In this section you will implement a function performing one step of stochastic gradient descent (with clipped gradients). You will go through the training examples one at a time, so the optimization algorithm will be stochastic gradient descent. As a reminder, here are the steps of a common optimization loop for an RNN:

  • Forward propagate through the RNN to compute the loss
  • Backward propagate through time to compute the gradients of the loss with respect to the parameters
  • Clip the gradients if necessary
  • Update your parameters using gradient descent

Exercise: Implement this optimization process (one step of stochastic gradient descent).

We provide you with the following functions:

def rnn_forward(X, Y, a_prev, parameters):
    """ Performs the forward propagation through the RNN and computes the cross-entropy loss.
    It returns the loss' value as well as a "cache" storing values to be used in the backpropagation."""
    ....
    return loss, cache

def rnn_backward(X, Y, parameters, cache):
    """ Performs the backward propagation through time to compute the gradients of the loss with respect
    to the parameters. It returns also all the hidden states."""
    ...
    return gradients, a

def update_parameters(parameters, gradients, learning_rate):
    """ Updates parameters using the Gradient Descent Update Rule."""
    ...
    return parameters

# GRADED FUNCTION: optimize

def optimize(X, Y, a_prev, parameters, learning_rate=0.01):
    """
    Execute one step of the optimization to train the model.

    Arguments:
    X -- list of integers, where each integer is a number that maps to a character in the vocabulary.
    Y -- list of integers, exactly the same as X but shifted one index to the left.
    a_prev -- previous hidden state.
    parameters -- python dictionary containing:
                    Wax -- Weight matrix multiplying the input, numpy array of shape (n_a, n_x)
                    Waa -- Weight matrix multiplying the hidden state, numpy array of shape (n_a, n_a)
                    Wya -- Weight matrix relating the hidden-state to the output, numpy array of shape (n_y, n_a)
                    b -- Bias, numpy array of shape (n_a, 1)
                    by -- Bias relating the hidden-state to the output, numpy array of shape (n_y, 1)
    learning_rate -- learning rate for the model.

    Returns:
    loss -- value of the loss function (cross-entropy)
    gradients -- python dictionary containing:
                    dWax -- Gradients of input-to-hidden weights, of shape (n_a, n_x)
                    dWaa -- Gradients of hidden-to-hidden weights, of shape (n_a, n_a)
                    dWya -- Gradients of hidden-to-output weights, of shape (n_y, n_a)
                    db -- Gradients of bias vector, of shape (n_a, 1)
                    dby -- Gradients of output bias vector, of shape (n_y, 1)
    a[len(X)-1] -- the last hidden state, of shape (n_a, 1)
    """
    ### START CODE HERE ###
    # Forward propagate through time (≈1 line)
    loss, cache = rnn_forward(X, Y, a_prev, parameters)
    # Backpropagate through time (≈1 line)
    gradients, a = rnn_backward(X, Y, parameters, cache)
    # Clip your gradients between -5 (min) and 5 (max) (≈1 line)
    gradients = clip(gradients, 5)
    # Update parameters (≈1 line)
    parameters = update_parameters(parameters, gradients, learning_rate)
    ### END CODE HERE ###

    return loss, gradients, a[len(X)-1]

np.random.seed(1)
vocab_size, n_a = 27, 100
a_prev = np.random.randn(n_a, 1)
Wax, Waa, Wya = np.random.randn(n_a, vocab_size), np.random.randn(n_a, n_a), np.random.randn(vocab_size, n_a)
b, by = np.random.randn(n_a, 1), np.random.randn(vocab_size, 1)
parameters = {"Wax": Wax, "Waa": Waa, "Wya": Wya, "b": b, "by": by}
X = [12, 3, 5, 11, 22, 3]
Y = [4, 14, 11, 22, 25, 26]

loss, gradients, a_last = optimize(X, Y, a_prev, parameters, learning_rate=0.01)
print("Loss =", loss)
print("gradients[\"dWaa\"][1][2] =", gradients["dWaa"][1][2])
print("np.argmax(gradients[\"dWax\"]) =", np.argmax(gradients["dWax"]))
print("gradients[\"dWya\"][1][2] =", gradients["dWya"][1][2])
print("gradients[\"db\"][4] =", gradients["db"][4])
print("gradients[\"dby\"][1] =", gradients["dby"][1])
print("a_last[4] =", a_last[4])

Loss = 126.50397572165383
gradients["dWaa"][1][2] = 0.1947093153471825
np.argmax(gradients["dWax"]) = 93
gradients["dWya"][1][2] = -0.007773876032003897
gradients["db"][4] = [-0.06809825]
gradients["dby"][1] = [0.01538192]
a_last[4] = [-1.]

*Expected output:*

**Loss** 126.503975722
**gradients["dWaa"][1][2]** 0.194709315347
**np.argmax(gradients["dWax"])** 93
**gradients["dWya"][1][2]** -0.007773876032
**gradients["db"][4]** [-0.06809825]
**gradients["dby"][1]** [ 0.01538192]
**a_last[4]** [-1.]

3.2 - Training the model

Given the dataset of dinosaur names, we use each line of the dataset (one name) as one training example. Every 100 steps of stochastic gradient descent, you will sample 10 randomly chosen names to see how the algorithm is doing. Remember to shuffle the dataset, so that stochastic gradient descent visits the examples in random order.

Exercise: Follow the instructions and implement model(). When examples[index] contains one dinosaur name (string), to create an example (X, Y), you can use this:

index = j % len(examples)
X = [None] + [char_to_ix[ch] for ch in examples[index]]
Y = X[1:] + [char_to_ix["\n"]]

Note that we use index = j % len(examples), where j = 1, ..., num_iterations, to make sure that examples[index] is always a valid statement (index is smaller than len(examples)).
The first entry of X being None will be interpreted by rnn_forward() as setting $x^{\langle 0 \rangle} = \vec{0}$. Further, this ensures that Y is equal to X but shifted one step to the left, and with an additional "\n" appended to signify the end of the dinosaur name.

# GRADED FUNCTION: model

def model(data, ix_to_char, char_to_ix, num_iterations=35000, n_a=50, dino_names=7, vocab_size=27):
    """
    Trains the model and generates dinosaur names.

    Arguments:
    data -- text corpus
    ix_to_char -- dictionary that maps the index to a character
    char_to_ix -- dictionary that maps a character to an index
    num_iterations -- number of iterations to train the model for
    n_a -- number of units of the RNN cell
    dino_names -- number of dinosaur names you want to sample at each iteration (7 names here)
    vocab_size -- number of unique characters found in the text, size of the vocabulary

    Returns:
    parameters -- learned parameters
    """
    # Retrieve n_x and n_y from vocab_size
    n_x, n_y = vocab_size, vocab_size

    # Initialize parameters
    parameters = initialize_parameters(n_a, n_x, n_y)

    # Initialize loss (this is required because we want to smooth our loss, don't worry about it)
    loss = get_initial_loss(vocab_size, dino_names)

    # Build list of all dinosaur names (training examples).
    with open("dinos.txt") as f:
        examples = f.readlines()
    examples = [x.lower().strip() for x in examples]

    # Shuffle list of all dinosaur names
    np.random.seed(0)
    np.random.shuffle(examples)

    # Initialize the hidden state of your LSTM
    a_prev = np.zeros((n_a, 1))

    # Optimization loop
    for j in range(num_iterations):
        ### START CODE HERE ###
        # Use the hint above to define one training example (X,Y) (≈ 2 lines)
        index = j % len(examples)
        X = [None] + [char_to_ix[ch] for ch in examples[index]]
        Y = X[1:] + [char_to_ix["\n"]]

        # Perform one optimization step: Forward-prop -> Backward-prop -> Clip -> Update parameters
        # Choose a learning rate of 0.01
        curr_loss, gradients, a_prev = optimize(X, Y, a_prev, parameters)
        ### END CODE HERE ###

        # Use a latency trick to keep the loss smooth. It happens here to accelerate the training.
        loss = smooth(loss, curr_loss)

        # Every 2000 iterations, generate "n" characters thanks to sample() to check if the model is learning properly
        if j % 2000 == 0:
            print('Iteration: %d, Loss: %f' % (j, loss) + '\n')
            # The number of dinosaur names to print
            seed = 0
            for name in range(dino_names):
                # Sample indices and print them
                sampled_indices = sample(parameters, char_to_ix, seed)
                print_sample(sampled_indices, ix_to_char)
                seed += 1  # To get the same result for grading purposes, increment the seed by one.
            print('\n')

    return parameters

Run the following cell, you should observe your model outputting random-looking characters at the first iteration. After a few thousand iterations, your model should learn to generate reasonable-looking names.

parameters = model(data, ix_to_char, char_to_ix)

Iteration: 0, Loss: 23.087336

Nkzxwtdmfqoeyhsqwasjkjvu
Kneb
Kzxwtdmfqoeyhsqwasjkjvu
Neb
Zxwtdmfqoeyhsqwasjkjvu
Eb
Xwtdmfqoeyhsqwasjkjvu

Iteration: 2000, Loss: 27.884160

Liusskeomnolxeros
Hmdaairus
Hytroligoraurus
Lecalosapaus
Xusicikoraurus
Abalpsamantisaurus
Tpraneronxeros

Iteration: 4000, Loss: 25.901815

Mivrosaurus
Inee
Ivtroplisaurus
Mbaaisaurus
Wusichisaurus
Cabaselachus
Toraperlethosdarenitochusthiamamumamaon

Iteration: 6000, Loss: 24.608779

Onwusceomosaurus
Lieeaerosaurus
Lxussaurus
Oma
Xusteonosaurus
Eeahosaurus
Toreonosaurus

Iteration: 8000, Loss: 24.070350

Onxusichepriuon
Kilabersaurus
Lutrodon
Omaaerosaurus
Xutrcheps
Edaksoje
Trodiktonus

Iteration: 10000, Loss: 23.844446

Onyusaurus
Klecalosaurus
Lustodon
Ola
Xusodonia
Eeaeosaurus
Troceosaurus

Iteration: 12000, Loss: 23.291971

Onyxosaurus
Kica
Lustrepiosaurus
Olaagrraiansaurus
Yuspangosaurus
Eealosaurus
Trognesaurus

Iteration: 14000, Loss: 23.382339

Meutromodromurus
Inda
Iutroinatorsaurus
Maca
Yusteratoptititan
Ca
Troclosaurus

Iteration: 16000, Loss: 23.259291

Meustomia
Indaadps
Justolongchudosatrus
Macabosaurus
Yuspanhosaurus
Caaerosaurus
Trodon

Iteration: 18000, Loss: 22.940799

Phusaurus
Meicamitheastosaurus
Mussteratops
Peg
Ytrong
Egaltor
Trolome

Iteration: 20000, Loss: 22.894192

Meutrodon
Lledansteh
Lwuspconyxauosaurus
Macalosaurus
Yusocichugus
Eiagosaurus
Trrangosaurus

Iteration: 22000, Loss: 22.851820

Onustolia
Midcagosaurus
Mwrrodonnonus
Ola
Yurodon
Eiaeptia
Trodoniohus

Iteration: 24000, Loss: 22.700408

Meutosaurus
Jmacagosaurus
Kurrodon
Macaistel
Yuroeleton
Eiaeror
Trodonosaurus

Iteration: 26000, Loss: 22.736918

Niutosaurus
Liga
Lustoingosaurus
Necakroia
Xrprinhtilus
Eiaestehastes
Trocilosaurus

Iteration: 28000, Loss: 22.595568

Meutosaurus
Kolaaeus
Kystodonisaurus
Macahtopadrus
Xtrrararkaumurpasaurus
Eiaeosaurus
Trodmanolus

Iteration: 30000, Loss: 22.609381

Meutosaurus
Kracakosaurus
Lustodon
Macaisthachwisaurus
Wusqandosaurus
Eiacosaurus
Trsatisaurus

Iteration: 32000, Loss: 22.251308

Mausinasaurus
Incaadropeglsaurus
Itrosaurus
Macamisaurus
Wuroenatoraerax
Ehanosaurus
Trnanclodratosaurus

Iteration: 34000, Loss: 22.477910

Mawspichaniaekorocimamroberax
Inda
Itrus
Macaesis
Wrosaurus
Elaeosaurus
Stegngosaurus

Conclusion

You can see that your algorithm has started to generate plausible dinosaur names towards the end of the training. At first, it was generating random characters, but towards the end you could see dinosaur names with cool endings. Feel free to run the algorithm even longer and play with hyperparameters to see if you can get even better results. Our implementation generated some really cool names like maconucon, marloralus and macingsersaurus. Your model hopefully also learned that dinosaur names tend to end in saurus, don, aura, tor, etc.

If your model generates some non-cool names, don’t blame the model entirely–not all actual dinosaur names sound cool. (For example, dromaeosauroides is an actual dinosaur name and is in the training set.) But this model should give you a set of candidates from which you can pick the coolest!

This assignment used a relatively small dataset, so that you could train an RNN quickly on a CPU. Training a model of the English language requires a much bigger dataset, usually needs much more computation, and could run for many hours on GPUs. We ran our dinosaur name generation for quite some time, and so far our favorite name is the great, undefeatable, and fierce: Mangosaurus!


4 - Writing like Shakespeare

The rest of this notebook is optional and is not graded, but we hope you’ll do it anyway since it’s quite fun and informative.

A similar (but more complicated) task is to generate Shakespeare poems. Instead of learning from a dataset of dinosaur names, you can use a collection of Shakespearian poems. Using LSTM cells, you can learn longer-term dependencies that span many characters in the text, e.g., where a character appearing somewhere in a sequence can influence what should be a different character much later in the sequence. These long-term dependencies were less important with dinosaur names, since the names were quite short.



Let’s become poets!

We have implemented a Shakespeare poem generator with Keras. Run the following cell to load the required packages and models. This may take a few minutes.

from __future__ import print_function
from keras.callbacks import LambdaCallback
from keras.models import Model, load_model, Sequential
from keras.layers import Dense, Activation, Dropout, Input, Masking
from keras.layers import LSTM
from keras.utils.data_utils import get_file
from keras.preprocessing.sequence import pad_sequences
from shakespeare_utils import *
import sys
import io

Using TensorFlow backend.
Loading text data...
Creating training set...
number of training examples: 31412
Vectorizing training set...
Loading model...

To save you some time, we have already trained a model for ~1000 epochs on a collection of Shakespearian poems called “The Sonnets”.

Let's train the model for one more epoch. When it finishes training for an epoch (this will also take a few minutes), you can run generate_output, which will prompt you for an input (<40 characters). The poem will start with your sentence, and our RNN-Shakespeare will complete the rest of the poem for you! For example, try "Forsooth this maketh no sense " (don't enter the quotation marks). Depending on whether you include the space at the end, your results might also differ; try it both ways, and try other inputs as well.


print_callback = LambdaCallback(on_epoch_end=on_epoch_end)

model.fit(x, y, batch_size=128, epochs=1, callbacks=[print_callback])

Epoch 1/1
31412/31412 [==============================] - 45s 1ms/step - loss: 2.5432

<keras.callbacks.History at 0x1f40ab380f0>

# Run this cell to try with different inputs without having to re-train the model
generate_output()

Write the beginning of your poem, the Shakespeare machine will complete it. Your input is: where are you ? my love.

Here is your poem:

where are you ? my love.
so to eve to by monter the time the bid,
and beautyso hearting foot chalke deand:
the lopperveh that bace my hister live mied,
my peeter's berllose briat of wrateling true,
a bud my ispeles thought i ashaying wited,
a wend the state's bucince i be peter tingside
is care on mening beronss, bage my theors,
on time thou thy srabus cide midh storms now butr,
he his witth fassude it tand:
i me and the

One extra remark: looking at this generated poem, I personally feel that AI/ML/DL still has enormous room to advance and innovate. Emotion is the soul of poetry, and how to endow a machine with emotion is something I am very curious about.

The RNN-Shakespeare model is very similar to the one you have built for dinosaur names. The only major differences are (see the sketch after this list):
- LSTMs instead of the basic RNN to capture longer-range dependencies
- The model is a deeper, stacked LSTM model (2 layer)
- Using Keras instead of python to simplify the code
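For intuition only, here is a minimal Keras sketch of such a stacked 2-layer character-level LSTM. The layer width, window length Tx, and vocabulary size below are made-up placeholders; the actual RNN-Shakespeare model is the one loaded from shakespeare_utils:

from keras.models import Sequential
from keras.layers import LSTM, Dense

Tx, vocab_size = 40, 38  # hypothetical sliding-window length and character-vocabulary size

# Two stacked LSTM layers: the first returns its full output sequence so the
# second LSTM can consume it; the Dense + softmax head outputs a distribution
# over the next character, analogous to the dinosaur model's Wya/by layer.
model = Sequential([
    LSTM(128, return_sequences=True, input_shape=(Tx, vocab_size)),
    LSTM(128),
    Dense(vocab_size, activation='softmax'),
])
model.compile(loss='categorical_crossentropy', optimizer='adam')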


If you want to learn more, you can also check out the Keras Team’s text generation implementation on GitHub: https://github.com/keras-team/keras/blob/master/examples/lstm_text_generation.py.

Congratulations on finishing this notebook!

References:
- This exercise took inspiration from Andrej Karpathy’s implementation: https://gist.github.com/karpathy/d4dee566867f8291f086. To learn more about text generation, also check out Karpathy’s blog post.
- For the Shakespearian poem generator, our implementation was based on the implementation of an LSTM text generator by the Keras team: https://github.com/keras-team/keras/blob/master/examples/lstm_text_generation.py

