Saving and Loading Models with sklearn, TensorFlow, and Keras

發(fā)布時(shí)間:2025/3/20 编程问答 39 豆豆
生活随笔 收集整理的這篇文章主要介紹了 Sklearn,TensorFlow,keras模型保存与读取 小編覺(jué)得挺不錯(cuò)的,現(xiàn)在分享給大家,幫大家做個(gè)參考.

I. Saving and loading sklearn models
1. Saving

import joblib  # in sklearn < 0.23 this was: from sklearn.externals import joblib
from sklearn import svm

X = [[0, 0], [1, 1]]
y = [0, 1]
clf = svm.SVC()
clf.fit(X, y)
joblib.dump(clf, "train_model.m")

2. Loading

clf = joblib.load("train_model.m")
clf.predict([[0, 0]])  # pass the test feature matrix (2-D) here
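joblib uses the same dump/load round-trip pattern as Python's built-in pickle module. As a dependency-free sketch of that pattern (using pickle and a hypothetical stand-in class instead of a fitted sklearn estimator), the persistence cycle looks like this:

```python
import os
import pickle
import tempfile

# Hypothetical stand-in for a fitted estimator: any picklable object
# round-trips the same way joblib.dump/load persists a model.
class TinyModel:
    def __init__(self, weights):
        self.weights = weights
    def predict(self, x):
        return [sum(w * xi for w, xi in zip(self.weights, row)) for row in x]

model = TinyModel([0.5, -1.0])
path = os.path.join(tempfile.mkdtemp(), "train_model.m")

with open(path, "wb") as f:
    pickle.dump(model, f)       # analogue of joblib.dump(clf, "train_model.m")

with open(path, "rb") as f:
    restored = pickle.load(f)   # analogue of joblib.load("train_model.m")

print(restored.predict([[2, 1]]))  # [0.0]
```

One caveat worth knowing: like pickle, joblib executes arbitrary code on load, so only load model files from sources you trust.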


II. Saving and loading TensorFlow models (variables only: this approach saves the variables rather than the whole graph, so before restoring you must first redefine the same network structure)
1. Saving

import tensorflow as tf

W = tf.Variable([[1, 1, 1], [2, 2, 2]], dtype=tf.float32, name='w')
b = tf.Variable([[0, 1, 2]], dtype=tf.float32, name='b')

init = tf.global_variables_initializer()  # initialize_all_variables() is deprecated
saver = tf.train.Saver()
with tf.Session() as sess:
    sess.run(init)
    save_path = saver.save(sess, "save/model.ckpt")

2. Loading

import tensorflow as tf

# Redefine variables with the same names and shapes they were saved under
W = tf.Variable(tf.truncated_normal(shape=(2, 3)), dtype=tf.float32, name='w')
b = tf.Variable(tf.truncated_normal(shape=(1, 3)), dtype=tf.float32, name='b')

saver = tf.train.Saver()
with tf.Session() as sess:
    saver.restore(sess, "save/model.ckpt")  # restore replaces variable initialization
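Since the save and load snippets above are separate scripts, it can help to see the whole round trip in one self-contained program. This is a minimal sketch written against the `tf.compat.v1` API (an assumption, so it also runs under TensorFlow 2.x with graph mode; under TF 1.x a plain `import tensorflow as tf` behaves the same), saving to a temporary directory instead of `save/`:

```python
import os
import tempfile
import tensorflow.compat.v1 as tf  # TF1-style API; under TF 1.x use `import tensorflow as tf`

tf.disable_eager_execution()

ckpt_path = os.path.join(tempfile.mkdtemp(), 'model.ckpt')

# Same variable definitions as in the saving snippet above
W = tf.Variable([[1, 1, 1], [2, 2, 2]], dtype=tf.float32, name='w')
b = tf.Variable([[0, 1, 2]], dtype=tf.float32, name='b')
saver = tf.train.Saver()

# Save in one session...
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    saver.save(sess, ckpt_path)

# ...and restore in a fresh session: no initializer needed,
# restore() itself fills in the variable values.
with tf.Session() as sess:
    saver.restore(sess, ckpt_path)
    restored_W = sess.run(W)

print(restored_W[0])  # [1. 1. 1.]
```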


III. Saving and loading TensorFlow models (this approach saves the whole graph)
1. Saving

import tensorflow as tf

# First, design your mathematical operations
# (we are in the default graph scope)

# Define some variables
v1 = tf.Variable(1., name="v1")
v2 = tf.Variable(2., name="v2")
# Define an operation
a = tf.add(v1, v2)

# Create a Saver object.
# By default, the Saver handles every Variable in the default graph,
all_saver = tf.train.Saver()
# but you can specify exactly which variables to save, under which names:
v2_saver = tf.train.Saver({"v2": v2})

# By default the Session handles the default graph and all its variables
with tf.Session() as sess:
    # Initialize v1 and v2
    sess.run(tf.global_variables_initializer())
    # v1 now holds 1.0 and v2 holds 2.0; save all of them
    all_saver.save(sess, 'data.chkp')
    # or save only v2
    v2_saver.save(sess, 'data-v2.chkp')

The model's weights are stored in the .chkp files, and the model's graph is stored in the .chkp.meta file.

2. Loading

import tensorflow as tf

# Load a previously saved meta graph into the graph currently in use
# (usually the default graph). This returns a Saver.
saver = tf.train.import_meta_graph('results/model.ckpt-1000.meta')

# We can now access the default graph, where all our metadata has been loaded
graph = tf.get_default_graph()

# Finally we can retrieve tensors, operations, collections, etc.
global_step_tensor = graph.get_tensor_by_name('loss/global_step:0')
train_op = graph.get_operation_by_name('loss/train_op')
hyperparameters = tf.get_collection('hyperparameters')

Restoring the weights:

Remember that actual weight values only live inside a session, so the restore operation must be run within a session, which then loads the saved values into the graph. The easiest way to think of restore is as a kind of data-initialization operation.

with tf.Session() as sess:
    # Initialize values with the saved data
    saver.restore(sess, 'results/model.ckpt-1000-00000-of-00001')
    print(sess.run(global_step_tensor))  # returns 1000
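The snippet above assumes an existing checkpoint under results/. A self-contained sketch of the same whole-graph round trip (build and save a tiny graph, then import its structure from the .meta file into an empty graph and restore the weights) might look like this; it uses the `tf.compat.v1` API as an assumption so it also runs under TensorFlow 2.x:

```python
import os
import tempfile
import tensorflow.compat.v1 as tf  # TF1-style API; under TF 1.x use `import tensorflow as tf`

tf.disable_eager_execution()

ckpt_path = os.path.join(tempfile.mkdtemp(), 'model.ckpt')

# Build and save a tiny graph (two variables plus a named op)
v1 = tf.Variable(1., name='v1')
v2 = tf.Variable(2., name='v2')
a = tf.add(v1, v2, name='a')
saver = tf.train.Saver()
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    saver.save(sess, ckpt_path)  # writes model.ckpt.* plus model.ckpt.meta

# Start from an empty graph: import the structure from the .meta file,
# then restore the weights from the checkpoint
tf.reset_default_graph()
new_saver = tf.train.import_meta_graph(ckpt_path + '.meta')
graph = tf.get_default_graph()
a_restored = graph.get_tensor_by_name('a:0')  # retrieve the op's output by name

with tf.Session() as sess:
    new_saver.restore(sess, ckpt_path)
    result = sess.run(a_restored)

print(result)  # 3.0
```

Note that nothing in the second half refers to the Python objects v1, v2, or a; everything is recovered from the files by name, which is what "saving the whole graph" buys you.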


IV. Saving and loading Keras models

from keras.models import load_model

model.save('my_model.h5')          # saves architecture, weights, and optimizer state in one HDF5 file
model = load_model('my_model.h5')
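The two lines above assume a `model` already exists. A minimal end-to-end sketch, assuming TensorFlow (with its bundled Keras and h5py) is installed, builds a trivial model, saves it, reloads it, and checks that the restored copy predicts identically:

```python
import os
import tempfile
import numpy as np
import tensorflow as tf

# A trivial model, just so there is something to persist
model = tf.keras.Sequential([
    tf.keras.Input(shape=(2,)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer='sgd', loss='mse')

x = np.array([[0., 0.], [1., 1.]], dtype='float32')
before = model.predict(x)

path = os.path.join(tempfile.mkdtemp(), 'my_model.h5')
model.save(path)                             # architecture + weights + optimizer state
restored = tf.keras.models.load_model(path)  # ready to predict or resume training
after = restored.predict(x)

print(np.allclose(before, after))  # True
```

Unlike the raw-TensorFlow approaches above, this single HDF5 file carries everything needed to rebuild the model, so no network structure has to be redefined before loading.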


轉(zhuǎn)載于:https://www.cnblogs.com/tectal/p/9053205.html
