
Theano 2.1.5 - Basics: Printing Theano Graphs


Source: http://deeplearning.net/software/theano/tutorial/printing_drawing.html

Printing/Drawing Theano graphs

Theano provides the functions theano.printing.pprint() and theano.printing.debugprint() to print a graph to the terminal before or after compilation. pprint() is more compact and math-like, while debugprint() is more verbose. Theano also provides pydotprint() to generate an image of the function's graph. For more details, see printing – Graph Printing and Symbolic Print Statement.

Note: printed Theano functions can sometimes be hard to read. To simplify them, you can disable some Theano optimizations with the Theano flag optimizer_excluding=fusion:inplace. Do not use this flag when running your actual workload, as it makes the graph slower and uses more memory.
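As a minimal sketch not shown in the original tutorial, one way to pass this flag is through the THEANO_FLAGS environment variable, which has to be set before theano is imported:

>>> import os
>>> os.environ["THEANO_FLAGS"] = "optimizer_excluding=fusion:inplace"  # must happen before importing theano
>>> import theano

The same setting can also be placed in a .theanorc configuration file.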

Consider again this logistic regression example:

>>> import numpy
>>> import theano
>>> import theano.tensor as T
>>> rng = numpy.random
>>> # Training data
>>> N = 400
>>> feats = 784
>>> D = (rng.randn(N, feats).astype(theano.config.floatX), rng.randint(size=N,low=0, high=2).astype(theano.config.floatX))
>>> training_steps = 10000
>>> # Declare Theano symbolic variables
>>> x = T.matrix("x")
>>> y = T.vector("y")
>>> w = theano.shared(rng.randn(feats).astype(theano.config.floatX), name="w")
>>> b = theano.shared(numpy.asarray(0., dtype=theano.config.floatX), name="b")
>>> x.tag.test_value = D[0]
>>> y.tag.test_value = D[1]
>>> # Construct Theano expression graph
>>> p_1 = 1 / (1 + T.exp(-T.dot(x, w)-b)) # Probability of having a one
>>> prediction = p_1 > 0.5 # The prediction that is done: 0 or 1
>>> # Compute gradients
>>> xent = -y*T.log(p_1) - (1-y)*T.log(1-p_1) # Cross-entropy
>>> cost = xent.mean() + 0.01*(w**2).sum() # The cost to optimize
>>> gw,gb = T.grad(cost, [w,b])
>>> # Training and prediction function
>>> train = theano.function(inputs=[x,y], outputs=[prediction, xent], updates=[[w, w-0.01*gw], [b, b-0.01*gb]], name = "train")
>>> predict = theano.function(inputs=[x], outputs=prediction, name = "predict")
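As a hedged aside that is not part of the printing tutorial itself, the two compiled functions can then be used as in the full logistic regression example from the Theano documentation:

>>> for i in range(training_steps):
...     pred, err = train(D[0], D[1])   # one gradient step on the training data
>>> predictions = predict(D[0])         # 0/1 predictions for the inputs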

Pretty printing:

>>> theano.printing.pprint(prediction)
'gt((TensorConstant{1} / (TensorConstant{1} + exp(((-(x \\dot w)) - b)))), TensorConstant{0.5})'
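Note that pprint() simply returns a string, so it can be applied to any intermediate symbolic variable and then logged or saved. A small sketch (output omitted) using the cross-entropy expression defined above:

>>> s = theano.printing.pprint(xent)   # xent is the symbolic cross-entropy from the example
>>> print(s)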

Debug print

The pre-compilation graph:

>>> theano.printing.debugprint(prediction)
Elemwise{gt,no_inplace} [@A] ''
 |Elemwise{true_div,no_inplace} [@B] ''
 | |DimShuffle{x} [@C] ''
 | | |TensorConstant{1} [@D]
 | |Elemwise{add,no_inplace} [@E] ''
 |   |DimShuffle{x} [@F] ''
 |   | |TensorConstant{1} [@D]
 |   |Elemwise{exp,no_inplace} [@G] ''
 |     |Elemwise{sub,no_inplace} [@H] ''
 |       |Elemwise{neg,no_inplace} [@I] ''
 |       | |dot [@J] ''
 |       |   |x [@K]
 |       |   |w [@L]
 |       |DimShuffle{x} [@M] ''
 |         |b [@N]
 |DimShuffle{x} [@O] ''
   |TensorConstant{0.5} [@P]
The post-compilation graph:

>>> theano.printing.debugprint(predict)
Elemwise{Composite{GT(scalar_sigmoid((-((-i0) - i1))), i2)}} [@A] ''   4
 |CGemv{inplace} [@B] ''   3
 | |Alloc [@C] ''   2
 | | |TensorConstant{0.0} [@D]
 | | |Shape_i{0} [@E] ''   1
 | |   |x [@F]
 | |TensorConstant{1.0} [@G]
 | |x [@F]
 | |w [@H]
 | |TensorConstant{0.0} [@D]
 |InplaceDimShuffle{x} [@I] ''   0
 | |b [@J]
 |TensorConstant{(1,) of 0.5} [@K]
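debugprint() also takes optional arguments that help when reading larger graphs. As a hedged sketch (argument names as in the Theano printing module; check them against your installed version), the printed depth can be limited and variable types can be shown:

>>> theano.printing.debugprint(prediction, depth=3)          # only print the first 3 levels of the tree
>>> theano.printing.debugprint(prediction, print_type=True)  # annotate each variable with its type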

Picture printing of graphs

The pre-compilation graph:

>>> theano.printing.pydotprint(prediction, outfile="pics/logreg_pydotprint_prediction.png", var_with_name_simple=True)
The output file is available at pics/logreg_pydotprint_prediction.png




The post-compilation graph:

>>> theano.printing.pydotprint(predict, outfile="pics/logreg_pydotprint_predict.png", var_with_name_simple=True)
The output file is available at pics/logreg_pydotprint_predict.png
The optimized training graph:

>>> theano.printing.pydotprint(train, outfile="pics/logreg_pydotprint_train.png", var_with_name_simple=True)
The output file is available at pics/logreg_pydotprint_train.png
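Note that pydotprint() relies on the pydot Python package and the graphviz binaries to render the image. As a hedged sketch (the format argument is as documented in the Theano printing module; verify against your installed version), the output format can be changed, for example to a PDF:

>>> theano.printing.pydotprint(train, outfile="pics/logreg_pydotprint_train.pdf",
...                            format="pdf", var_with_name_simple=True)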


