Implementing an MNIST Autoencoder in TensorFlow (1)
An autoencoder is a network that learns features from the samples themselves. It is an unsupervised model: it learns features from unlabeled data, and the learned representation often describes the data better than the raw input, which gives the autoencoder strong feature-learning ability.
The overall structure is: high-dimensional samples → encoded into low-dimensional features → decoded back into the high-dimensional space. The demonstration below uses the MNIST dataset (784 → 256 → 128 → 256 → 784):
import tensorflow as tf
from tensorflow.examples.tutorials.mnist import input_data

mnist = input_data.read_data_sets('/data/', one_hot=True)

learning_rate = 0.01
n_hidden_1 = 256   # units in the first hidden layer
n_hidden_2 = 128   # units in the second hidden layer (the "code")
n_input = 784      # MNIST images are 28 x 28 = 784 pixels

x = tf.placeholder('float', [None, n_input])
y = x   # the reconstruction target is the input itself

weights = {
    'encoder_h1': tf.Variable(tf.random_normal([n_input, n_hidden_1])),
    'encoder_h2': tf.Variable(tf.random_normal([n_hidden_1, n_hidden_2])),
    'decoder_h1': tf.Variable(tf.random_normal([n_hidden_2, n_hidden_1])),
    'decoder_h2': tf.Variable(tf.random_normal([n_hidden_1, n_input])),
}
biases = {
    'encoder_b1': tf.Variable(tf.zeros([n_hidden_1])),
    'encoder_b2': tf.Variable(tf.zeros([n_hidden_2])),
    'decoder_b1': tf.Variable(tf.zeros([n_hidden_1])),
    'decoder_b2': tf.Variable(tf.zeros([n_input])),
}

def encoder(x):
    # 784 -> 256 -> 128
    layer_1 = tf.nn.sigmoid(tf.add(tf.matmul(x, weights['encoder_h1']), biases['encoder_b1']))
    layer_2 = tf.nn.sigmoid(tf.add(tf.matmul(layer_1, weights['encoder_h2']), biases['encoder_b2']))
    return layer_2

def decoder(x):
    # 128 -> 256 -> 784
    layer_1 = tf.nn.sigmoid(tf.add(tf.matmul(x, weights['decoder_h1']), biases['decoder_b1']))
    layer_2 = tf.nn.sigmoid(tf.add(tf.matmul(layer_1, weights['decoder_h2']), biases['decoder_b2']))
    return layer_2

pred = decoder(encoder(x))
cost = tf.reduce_mean(tf.pow(y - pred, 2))   # mean squared reconstruction error
optimizer = tf.train.GradientDescentOptimizer(learning_rate).minimize(cost)

training_epochs = 20
batch_size = 256
display_step = 5

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    total_batch = int(mnist.train.num_examples / batch_size)
    for epoch in range(training_epochs):
        for i in range(total_batch):
            batch_xs, batch_ys = mnist.train.next_batch(batch_size)
            # only the images are fed; the labels are not used
            _, c = sess.run([optimizer, cost], feed_dict={x: batch_xs})
        if epoch % display_step == 0:
            print("Epoch:", '%4d' % (epoch + 1), 'cost=', "{:.9f}".format(c))
    print('Training Finished!')

    # compare the position of the brightest pixel in the reconstruction vs. the input;
    # 1 - accuracy is the fraction of test images where they disagree
    correct_prediction = tf.equal(tf.argmax(pred, 1), tf.argmax(y, 1))
    accuracy = tf.reduce_mean(tf.cast(correct_prediction, 'float'))
    print('Accuracy:', 1 - accuracy.eval({x: mnist.test.images, y: mnist.test.images}))
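To see how well the network reconstructs digits, it helps to plot a few test images next to their reconstructions. The snippet below is a minimal sketch, not part of the original article; it assumes matplotlib is available and that it is appended inside the same with tf.Session() block, after training has finished.

# Minimal visualization sketch (assumption: runs inside the same session, after training)
import matplotlib.pyplot as plt

n_show = 5
test_imgs = mnist.test.images[:n_show]
recon = sess.run(pred, feed_dict={x: test_imgs})   # reconstructed images

fig, axes = plt.subplots(2, n_show, figsize=(n_show, 2))
for i in range(n_show):
    axes[0][i].imshow(test_imgs[i].reshape(28, 28), cmap='gray')   # original digit
    axes[1][i].imshow(recon[i].reshape(28, 28), cmap='gray')       # reconstruction
    axes[0][i].axis('off')
    axes[1][i].axis('off')
plt.show()

With only 20 epochs of plain gradient descent the reconstructions are typically blurry; more epochs or an adaptive optimizer would sharpen them, but that is beyond this first example.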
Summary
That concludes Implementing an MNIST Autoencoder in TensorFlow (1); hopefully it helps with the problem you were trying to solve.