

Deploying Keras model on Tensorflow Serving


I trained a binary classification model with Keras. The requirement was to run that Keras model on TensorFlow Serving (TensorFlow Serving is a system for serving models in production environments).

Converting a Keras model to a TensorFlow Serving model

The method I used to convert the Keras model into a TensorFlow Serving model is as follows:

1. Obtain the trained Keras model file (an HDF5 file).

This file should contain:

  • the architecture of the model, so the model can be reconstructed
  • the weights of the model
  • the training configuration (loss function, optimizer, etc.)
  • the state of the optimizer, so training can resume where it left off
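The article never shows the model itself (`build_model` is elided below), so as a hedged sketch, a minimal binary classifier that would produce such an HDF5 file might look like this. The architecture is entirely hypothetical; only the 7-feature input is taken from the `shape=(1, 7)` request the client sends later:

```python
import tensorflow as tf

def build_model():
    # Hypothetical architecture: a small binary classifier over
    # 7 input features (matching the client's shape=(1, 7) request).
    model = tf.keras.models.Sequential([
        tf.keras.layers.Dense(16, activation='relu', input_shape=(7,)),
        tf.keras.layers.Dense(1, activation='sigmoid'),
    ])
    model.compile(loss='binary_crossentropy', optimizer='adam',
                  metrics=['accuracy'])
    return model

model = build_model()
# One call stores architecture, weights, training configuration
# and optimizer state in a single HDF5 file.
model.save('my_model.h5')
```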

2. Write the code that converts the Keras model to a TensorFlow Serving model.

```python
import pandas as pd
import os
import tensorflow as tf

tf.logging.set_verbosity(tf.logging.INFO)

...

def build_model():
    ############
    ...
    return model

def save_model_for_production(model, version, path='prod_models'):
    tf.keras.backend.set_learning_phase(1)
    if not os.path.exists(path):
        os.mkdir(path)
    export_path = os.path.join(
        tf.compat.as_bytes(path),
        tf.compat.as_bytes(version))
    builder = tf.saved_model.builder.SavedModelBuilder(export_path)
    model_input = tf.saved_model.utils.build_tensor_info(model.input)
    model_output = tf.saved_model.utils.build_tensor_info(model.output)
    prediction_signature = (
        tf.saved_model.signature_def_utils.build_signature_def(
            inputs={'inputs': model_input},
            outputs={'output': model_output},
            method_name=tf.saved_model.signature_constants.PREDICT_METHOD_NAME))
    with tf.keras.backend.get_session() as sess:
        builder.add_meta_graph_and_variables(
            sess=sess,
            tags=[tf.saved_model.tag_constants.SERVING],
            signature_def_map={'predict': prediction_signature})
        builder.save()

if __name__ == '__main__':
    model_file = './my_model.h5'
    if os.path.isfile(model_file):
        print('Model file detected. Loading.')
        model = tf.keras.models.load_model(model_file)
    else:
        print('No model file detected. Starting from scratch.')
        model = build_model()
        model.compile(loss='binary_crossentropy', optimizer='adam',
                      metrics=['accuracy'])
        model.save(model_file)
    model.fit(X_train, y_train, batch_size=100, epochs=1,
              validation_data=(X_test, y_test))
    model.summary()
    export_path = "tf-model"
    save_model_for_production(model, "1", export_path)
```

The example above saves the model under the tf-model directory, whose structure looks like this:

```
tf-model/
└── 1
    ├── saved_model.pb
    └── variables
        ├── variables.data-00000-of-00001
        └── variables.index
```
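TensorFlow Serving treats each numeric subdirectory under the model base path as a model version and, by default, serves the highest one. That selection rule can be sketched with the standard library (the helper name is mine, not part of TensorFlow):

```python
import os

def latest_version(model_base_path):
    """Return the highest numeric version subdirectory, i.e. the one
    TensorFlow Serving would pick up and serve by default."""
    versions = [int(d) for d in os.listdir(model_base_path)
                if d.isdigit() and os.path.isdir(os.path.join(model_base_path, d))]
    if not versions:
        raise ValueError('no version directories found under %s' % model_base_path)
    return max(versions)
```

Exporting a retrained model under tf-model/2 would therefore make the server switch to version 2 automatically.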

saved_model.pb is the model that TensorFlow Serving can load and run.

3. Run the model.

```shell
tensorflow_model_server --port=9000 --model_name="username" --model_base_path="/data/models/tf-model/"
```

Standard output like the following means the model is up and running:

```
Running ModelServer at 0.0.0.0:9000 ...
```
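Before running the gRPC client below, a quick standard-library check that something is actually listening on the serving port can save some head-scratching. The helper name and defaults are mine; it only tests the TCP connection, it does not speak gRPC:

```python
import socket

def server_is_up(host='localhost', port=9000, timeout=2.0):
    # Try a plain TCP connection to the gRPC port; this confirms a
    # process is listening there, nothing more.
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```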

4. Client code.

```python
#!/usr/bin/env python
# encoding: utf-8
"""
@version: v1.0
@author: zwqjoy
@contact: zwqjoy@163.com
@site: https://blog.csdn.net/zwqjoy
@file: client
@time: 2018/6/29 15:02
"""
from __future__ import print_function

from grpc.beta import implementations
import numpy as np
import tensorflow as tf
from tensorflow_serving.apis import predict_pb2
from tensorflow_serving.apis import prediction_service_pb2

tf.app.flags.DEFINE_string('server', 'localhost:9000',
                           'PredictionService host:port')
FLAGS = tf.app.flags.FLAGS

def main(_):
    host, port = FLAGS.server.split(':')
    channel = implementations.insecure_channel(host, int(port))
    stub = prediction_service_pb2.beta_create_PredictionService_stub(channel)
    # Send request.
    # See prediction_service.proto for gRPC request/response details.
    data = np.array([4, 0, 0, 0, 1, 0, 1])
    data = data.astype(np.float32)
    request = predict_pb2.PredictRequest()
    # Must match --model_name="username" passed to tensorflow_model_server.
    request.model_spec.name = 'username'
    # Must match the key used in signature_def_map.
    request.model_spec.signature_name = 'predict'
    # The shape must match the Keras model.input.
    request.inputs['inputs'].CopyFrom(
        tf.contrib.util.make_tensor_proto(data, shape=(1, 7)))
    result = stub.Predict(request, 10.0)  # 10 secs timeout
    print(result)

if __name__ == '__main__':
    tf.app.run()
```


The client prints:

```
outputs {
  key: "output"
  value {
    dtype: DT_FLOAT
    tensor_shape {
      dim { size: 1 }
      dim { size: 1 }
    }
    float_val: 0.976889811523
  }
}
```


float_val: 0.976889811523 is the result we need (the predicted probability).
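Since the model emits a single sigmoid probability, turning the served value into a class label is a one-liner. A sketch with an assumed 0.5 cut-off (the threshold is my choice, not the article's):

```python
def to_label(prob, threshold=0.5):
    # Binary decision from the served probability; 0.5 is an assumed
    # default, tune it for your precision/recall trade-off.
    return 1 if prob >= threshold else 0
```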


Notes on converting Keras models to TensorFlow models

1. Saving a Keras model

model.save(filepath) saves both the Keras model and its weights into a single HDF5 file, which contains:

  • the architecture of the model, so the model can be reconstructed
  • the weights of the model
  • the training configuration (loss function, optimizer, etc.)
  • the state of the optimizer, so training can resume where it left off

This HDF5 file can also be generated with code like the following:

```python
model.save('my_model.h5')
```

2. Loading a Keras model

Loading a Keras model (some code omitted in the middle):


```python
import numpy as np
from keras.datasets import mnist
from keras.utils import np_utils
from keras.models import Sequential
from keras.layers import Dense
from keras.optimizers import SGD
from keras.models import load_model

# Load the data
(x_train, y_train), (x_test, y_test) = mnist.load_data()
# (60000, 28, 28)
print('x_shape:', x_train.shape)
# (60000,)
print('y_shape:', y_train.shape)
# (60000, 28, 28) -> (60000, 784)
x_train = x_train.reshape(x_train.shape[0], -1) / 255.0
x_test = x_test.reshape(x_test.shape[0], -1) / 255.0
# Convert labels to one-hot format
y_train = np_utils.to_categorical(y_train, num_classes=10)
y_test = np_utils.to_categorical(y_test, num_classes=10)

# Load the model
model = load_model('model.h5')

# Evaluate the model
loss, accuracy = model.evaluate(x_test, y_test)
print('\ntest loss', loss)
print('accuracy', accuracy)

# Continue training
model.fit(x_train, y_train, batch_size=64, epochs=2)

# Evaluate again
loss, accuracy = model.evaluate(x_test, y_test)
print('\ntest loss', loss)
print('accuracy', accuracy)

# Save and load the weights only
model.save_weights('my_model_weights.h5')
model.load_weights('my_model_weights.h5')
```


Pitfalls of converting a Keras model to a TensorFlow Serving model

Hopefully this saves newcomers a few detours.

Pitfall 1: a deprecated export method

Some export methods are deprecated, for example the following:

```python
from tensorflow_serving.session_bundle import exporter

export_path = ...  # where to save the exported graph
export_version = ...  # version number (integer)

saver = tf.train.Saver(sharded=True)
model_exporter = exporter.Exporter(saver)
signature = exporter.classification_signature(
    input_tensor=model.input,
    scores_tensor=model.output)
model_exporter.init(sess.graph.as_graph_def(),
                    default_graph_signature=signature)
model_exporter.export(export_path, tf.constant(export_version), sess)
```

If you use this deprecated method, TensorFlow Serving prints a warning when the model is loaded:

```
WARNING:tensorflow:From test.py:107: Exporter.export (from tensorflow.contrib.session_bundle.exporter) is deprecated and will be removed after 2017-06-30. Instructions for updating: No longer supported. Switch to SavedModel immediately.
```

The warning makes it clear that this method is abandoned and no longer supported; we are advised to switch to SavedModel.

The fix: use SavedModel

```python
def save_model_for_production(model, version, path='prod_models'):
    tf.keras.backend.set_learning_phase(1)
    if not os.path.exists(path):
        os.mkdir(path)
    export_path = os.path.join(
        tf.compat.as_bytes(path),
        tf.compat.as_bytes(version))
    builder = tf.saved_model.builder.SavedModelBuilder(export_path)
    model_input = tf.saved_model.utils.build_tensor_info(model.input)
    model_output = tf.saved_model.utils.build_tensor_info(model.output)
    prediction_signature = (
        tf.saved_model.signature_def_utils.build_signature_def(
            inputs={'inputs': model_input},
            outputs={'output': model_output},
            method_name=tf.saved_model.signature_constants.PREDICT_METHOD_NAME))
    with tf.keras.backend.get_session() as sess:
        builder.add_meta_graph_and_variables(
            sess=sess,
            tags=[tf.saved_model.tag_constants.SERVING],
            signature_def_map={'predict': prediction_signature})
        builder.save()
```
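After exporting, it is worth confirming the directory actually has the layout shown earlier (saved_model.pb plus a variables/ subdirectory) before pointing tensorflow_model_server at it. A standard-library sanity check; the helper is mine, not part of TensorFlow:

```python
import os

def looks_like_saved_model(export_dir):
    # A usable SavedModel export has a serialized graph file plus
    # a variables/ directory holding the checkpoint shards.
    has_graph = any(os.path.isfile(os.path.join(export_dir, name))
                    for name in ('saved_model.pb', 'saved_model.pbtxt'))
    has_variables = os.path.isdir(os.path.join(export_dir, 'variables'))
    return has_graph and has_variables
```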

References:

https://www.jianshu.com/p/91aae37f1da6

Deploying Keras model on Tensorflow Serving with GPU support

https://github.com/amir-abdi/keras_to_tensorflow
