

Equivalent methods in PyTorch and TensorFlow 2.0

Published 2024/7/5 · 编程问答 · 豆豆
This article, collected by 生活随笔, compares common PyTorch methods with their TensorFlow 2.0 equivalents, for reference.

Embedding initialization

pytorch: nn.Embedding()
tf2.0: tf.random.normal()

import numpy as np
import torch.nn as nn
import tensorflow as tf

# Verify the mean and variance of the sampled weights
def confirm(weight):
    mean = np.sum(weight) / dim
    print("mean: {}".format(mean))
    square_sum = np.sum((mean - weight) ** 2)
    print("variance: {}".format(square_sum / dim))

dim = 1000000  # the larger dim is, the closer the mean and variance get to 0 and 1

embd = nn.Embedding(5, dim)  # by default, the trainable weights follow a N(0, 1) normal distribution
weight = embd.weight.data[0].numpy()
confirm(weight)

embd2 = tf.Variable(tf.random.normal([5, dim]))  # defaults to a N(0, 1) normal distribution
weight2 = embd2.numpy()[0]
confirm(weight2)

Tensor initialization

pytorch: nn.init.xavier_uniform_()
tf2.0: tf.initializers.GlorotUniform()

import numpy as np
import torch
import torch.nn as nn
import tensorflow as tf

# Verify the mean and variance of the initialized weights
def confirm(weight):
    mean = np.sum(weight) / dim
    print("mean: {}".format(mean))
    square_sum = np.sum((mean - weight) ** 2)
    print("variance: {}".format(square_sum / dim))

dim = 1000000

w = nn.Parameter(torch.zeros(size=(3, dim)))
nn.init.xavier_uniform_(w.data)
weight = w.data[0].numpy()
confirm(weight)

initializer = tf.initializers.GlorotUniform()
w2 = tf.Variable(initializer(shape=[3, dim]))
weight2 = w2[0].numpy()
confirm(weight2)
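Both xavier_uniform_ and GlorotUniform draw from U(-a, a) with a = sqrt(6 / (fan_in + fan_out)), which gives a variance of 2 / (fan_in + fan_out), not 1. A minimal numpy-only sketch of that formula (the glorot_uniform helper here is hypothetical, written for illustration, not a library function):

```python
import numpy as np

# Hypothetical helper mimicking what xavier_uniform_ / GlorotUniform sample:
# U(-a, a) with a = sqrt(6 / (fan_in + fan_out)), so Var = a^2 / 3 = 2 / (fan_in + fan_out)
def glorot_uniform(fan_in, fan_out, rng):
    a = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-a, a, size=(fan_in, fan_out))

rng = np.random.default_rng(0)
w = glorot_uniform(3, 1000000, rng)

# Empirical variance should land close to the theoretical 2 / (fan_in + fan_out)
expected_var = 2.0 / (3 + 1000000)
print(abs(w.var() - expected_var) < 0.1 * expected_var)
```

So for a (3, 1000000) tensor, the "variance" printed above should come out near 2e-6 rather than near 1.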

Multi-class cross-entropy loss

pytorch: nn.CrossEntropyLoss()
tf2.0: tf.losses.categorical_crossentropy()

import numpy as np
import torch
import tensorflow as tf

logits = np.random.random((3, 3))
input_p = torch.tensor(logits)
# cast to float32 so the dtype matches the float32 labels produced below
input_t = tf.convert_to_tensor(logits, dtype=tf.float32)

target_p = torch.tensor([1, 2, 2])
target_t1 = tf.keras.utils.to_categorical([1, 2, 2])
target_t2 = tf.constant([1, 2, 2])
target_t3 = tf.one_hot([1, 2, 2], depth=3)

p_f = torch.nn.CrossEntropyLoss()
loss1 = p_f(input_p, target_p)
print(loss1)

# Method 1: one-hot labels via to_categorical
loss2 = tf.losses.categorical_crossentropy(y_true=target_t1, y_pred=tf.nn.softmax(input_t, axis=1))
print(tf.reduce_mean(loss2))

# Method 2: integer labels with sparse_categorical_crossentropy
loss3 = tf.keras.losses.sparse_categorical_crossentropy(y_true=target_t2, y_pred=tf.nn.softmax(input_t, axis=1))
print(tf.reduce_mean(loss3))

# Method 3: one-hot labels via tf.one_hot
loss4 = tf.keras.losses.categorical_crossentropy(y_true=target_t3, y_pred=tf.nn.softmax(input_t, axis=1))
print(tf.reduce_mean(loss4))
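Under the hood, all of these variants compute the same quantity: the mean over samples of -log(softmax(logits)[target]). A numpy-only sketch of that formula, using made-up logits rather than the random input above:

```python
import numpy as np

logits = np.array([[2.0, 1.0, 0.1],
                   [0.5, 2.5, 0.3],
                   [1.2, 0.7, 3.0]])
targets = np.array([1, 2, 2])

# Subtract the row max first for numerical stability; softmax is unchanged by a shift
shifted = logits - logits.max(axis=1, keepdims=True)
probs = np.exp(shifted) / np.exp(shifted).sum(axis=1, keepdims=True)

# Cross-entropy: mean of the negative log-probability assigned to each true class
loss = -np.log(probs[np.arange(len(targets)), targets]).mean()
print(loss)
```

Feeding the same logits and targets to nn.CrossEntropyLoss or any of the three TF methods above should reproduce this value.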

Binary cross-entropy loss

pytorch: nn.BCEWithLogitsLoss()
tf2.0: tf.nn.sigmoid_cross_entropy_with_logits()

import numpy as np
import torch
import tensorflow as tf

logits = np.random.random((3, 3))
input_p = torch.tensor(logits)
input_t = tf.convert_to_tensor(logits)

target = np.array([[0., 1., 1.], [0., 0., 1.], [1., 0., 1.]])
target_p = torch.tensor(target)
target_t = tf.convert_to_tensor(target)

p_f = torch.nn.BCEWithLogitsLoss()
loss1 = p_f(input_p, target_p)
print(loss1)

# Method 1: low-level op on raw logits
loss2 = tf.nn.sigmoid_cross_entropy_with_logits(logits=input_t, labels=target_t)
print(tf.reduce_mean(loss2))

# Method 2: Keras loss with from_logits=True
loss_fn = tf.keras.losses.BinaryCrossentropy(from_logits=True)
loss3 = loss_fn(y_true=target_t, y_pred=input_t)
print(tf.reduce_mean(loss3))
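Both BCEWithLogitsLoss and sigmoid_cross_entropy_with_logits fold the sigmoid into the loss using the numerically stable form max(x, 0) - x*z + log(1 + exp(-|x|)). A numpy sketch checking that this matches the naive definition (the example values are made up):

```python
import numpy as np

x = np.array([[1.5, -0.3], [0.2, 2.0]])  # logits
z = np.array([[1.0, 0.0], [0.0, 1.0]])   # binary labels

# Stable form applied to raw logits (never takes the log of a tiny sigmoid)
stable = np.maximum(x, 0) - x * z + np.log1p(np.exp(-np.abs(x)))

# Naive definition: -z*log(sigmoid(x)) - (1-z)*log(1-sigmoid(x))
s = 1.0 / (1.0 + np.exp(-x))
naive = -z * np.log(s) - (1 - z) * np.log(1 - s)

print(np.allclose(stable, naive))
```

This is why both APIs expect raw logits: applying sigmoid yourself and then taking logs loses precision once |x| gets large, while the stable form does not.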

Summary

That covers all of the PyTorch-to-TensorFlow 2.0 method replacements collected by 生活随笔 here; hopefully it helps you solve the problem you ran into.
