

Freezing a graph to pb in TensorFlow 2


九州編程 2022-06-07 19:11:21
We deploy a lot of models from TF1 by freezing the graph and saving it:

tf.train.write_graph(self.session.graph_def, some_path)

# get graph definitions with weights
output_graph_def = tf.graph_util.convert_variables_to_constants(
        self.session,                       # The session is used to retrieve the weights
        self.session.graph.as_graph_def(),  # The graph_def is used to retrieve the nodes
        output_nodes,                       # The output node names are used to select the useful nodes
)

# optimize graph
if optimize:
    output_graph_def = optimize_for_inference_lib.optimize_for_inference(
            output_graph_def, input_nodes, output_nodes, tf.float32.as_datatype_enum
    )

with open(path, "wb") as f:
    f.write(output_graph_def.SerializeToString())

and then load them with:

with tf.Graph().as_default() as graph:
    with graph.device("/" + args[name].processing_unit):
        tf.import_graph_def(graph_def, name="")
        for key, value in inputs.items():
            self.input[key] = graph.get_tensor_by_name(value + ":0")

We would like to save TF2 models in a similar way: a single protobuf file containing both the graph and the weights. How can I achieve that? I know of a few save methods:

keras.experimental.export_saved_model(model, 'path_to_saved_model') - this is experimental and creates multiple files :(.

model.save('path_to_my_model.h5') - this saves in h5 format :(.

tf.saved_model.save(self.model, "test_x_model") - this again saves multiple files :(.

3 Answers

守著一只汪


The code above is a bit old. It converts vgg16 successfully, but converting a resnet_v2_50 model fails. My tf version is 2.2.0. In the end, I found a useful code snippet:


import tensorflow as tf
from tensorflow import keras
from tensorflow.python.framework.convert_to_constants import convert_variables_to_constants_v2
import numpy as np

# set resnet50_v2 as an example
model = tf.keras.applications.ResNet50V2()

full_model = tf.function(lambda x: model(x))
full_model = full_model.get_concrete_function(
    tf.TensorSpec(model.inputs[0].shape, model.inputs[0].dtype))

# Get frozen ConcreteFunction
frozen_func = convert_variables_to_constants_v2(full_model)
frozen_func.graph.as_graph_def()

layers = [op.name for op in frozen_func.graph.get_operations()]
print("-" * 50)
print("Frozen model layers: ")
for layer in layers:
    print(layer)

print("-" * 50)
print("Frozen model inputs: ")
print(frozen_func.inputs)
print("Frozen model outputs: ")
print(frozen_func.outputs)

# Save frozen graph from frozen ConcreteFunction to hard drive
tf.io.write_graph(graph_or_graph_def=frozen_func.graph,
                  logdir="./frozen_models",
                  name="frozen_graph.pb",
                  as_text=False)

Reference: https://github.com/leimao/Frozen_Graph_TensorFlow/tree/master/TensorFlow_v2 (updated)
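
To load the resulting frozen_graph.pb back under TF2, the same reference wraps tf.compat.v1.import_graph_def in a callable function. The following is a minimal sketch along those lines; the tensor names "x:0" and "Identity:0" are assumptions that match the lambda-wrapped model above, so check the inputs/outputs printed by the snippet to get the right names for your own graph:

import tensorflow as tf

def wrap_frozen_graph(graph_def, inputs, outputs):
    # Import the GraphDef inside a wrapped TF1-style function, then prune it
    # down to a callable that maps the given input tensors to the outputs
    def _imports_graph_def():
        tf.compat.v1.import_graph_def(graph_def, name="")
    wrapped_import = tf.compat.v1.wrap_function(_imports_graph_def, [])
    import_graph = wrapped_import.graph
    return wrapped_import.prune(
        tf.nest.map_structure(import_graph.as_graph_element, inputs),
        tf.nest.map_structure(import_graph.as_graph_element, outputs))

# Read the frozen graph written above
with tf.io.gfile.GFile("./frozen_models/frozen_graph.pb", "rb") as f:
    graph_def = tf.compat.v1.GraphDef()
    graph_def.ParseFromString(f.read())

# Tensor names are assumptions; use the printed frozen model inputs/outputs
frozen_func = wrap_frozen_graph(graph_def, inputs=["x:0"], outputs=["Identity:0"])
predictions = frozen_func(tf.random.normal([1, 224, 224, 3]))
print(predictions[0].shape)  # e.g. (1, 1000) for ResNet50V2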


有只小跳蛙


I convert models with TF2 as follows:

1. During training, pass keras.callbacks.ModelCheckpoint(save_weights_only=True) to model.fit to save a checkpoint;

2. After training, load the checkpoint with self.model.load_weights(self.checkpoint_path) and convert it to h5 with self.model.save(h5_path, overwrite=True, include_optimizer=False) (see the sketch after the code below);

3. Convert the h5 to pb:

import logging
import tensorflow as tf
from tensorflow.compat.v1 import graph_util
from tensorflow.python.keras import backend as K
from tensorflow import keras

# necessary !!!
tf.compat.v1.disable_eager_execution()

h5_path = '/path/to/model.h5'
model = keras.models.load_model(h5_path)
model.summary()

# save pb
with K.get_session() as sess:
    output_names = [out.op.name for out in model.outputs]
    input_graph_def = sess.graph.as_graph_def()
    for node in input_graph_def.node:
        node.device = ""
    graph = graph_util.remove_training_nodes(input_graph_def)
    graph_frozen = graph_util.convert_variables_to_constants(sess, graph, output_names)
    # write_graph takes the output directory and the file name as separate arguments
    tf.io.write_graph(graph_frozen, '/path/to/pb', 'model.pb', as_text=False)
logging.info("save pb successfully!")
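
Steps 1 and 2 above are only described in prose; here is a minimal, self-contained sketch of that part. The tiny Dense model, the dummy data, and the paths /tmp/weights.ckpt and /tmp/model.h5 are placeholders, not part of the original answer:

import numpy as np
import tensorflow as tf
from tensorflow import keras

checkpoint_path = "/tmp/weights.ckpt"  # illustrative paths
h5_path = "/tmp/model.h5"

# Tiny stand-in model so the sketch runs end to end
model = keras.Sequential([keras.layers.Dense(10, activation="softmax", input_shape=(784,))])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

x_train = np.random.rand(32, 784).astype("float32")
y_train = np.random.randint(0, 10, size=(32,))

# Step 1: save weight-only checkpoints during training
ckpt_cb = keras.callbacks.ModelCheckpoint(checkpoint_path, save_weights_only=True)
model.fit(x_train, y_train, epochs=1, callbacks=[ckpt_cb])

# Step 2: reload the checkpoint and export a single .h5 file without optimizer state
model.load_weights(checkpoint_path)
model.save(h5_path, overwrite=True, include_optimizer=False)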


喵喵時光機


I ran into a similar problem and found the solution below:

import tensorflow as tf
from tensorflow.python.framework.convert_to_constants import convert_variables_to_constants_v2
from tensorflow.python.tools import optimize_for_inference_lib

optimize = True  # set to False to skip the optimize_for_inference pass

loaded = tf.saved_model.load('models/mnist_test')
infer = loaded.signatures['serving_default']

f = tf.function(infer).get_concrete_function(
    flatten_input=tf.TensorSpec(shape=[None, 28, 28, 1],
                                dtype=tf.float32))  # change this line for your own inputs
f2 = convert_variables_to_constants_v2(f)
graph_def = f2.graph.as_graph_def()

if optimize:
    # Remove NoOp nodes
    for i in reversed(range(len(graph_def.node))):
        if graph_def.node[i].op == 'NoOp':
            del graph_def.node[i]

    # Drop control-dependency inputs (names starting with '^')
    for node in graph_def.node:
        for i in reversed(range(len(node.input))):
            if node.input[i][0] == '^':
                del node.input[i]

    # Parse graph's inputs/outputs
    graph_inputs = [x.name.rsplit(':')[0] for x in f2.inputs]
    graph_outputs = [x.name.rsplit(':')[0] for x in f2.outputs]

    graph_def = optimize_for_inference_lib.optimize_for_inference(graph_def,
                                                                  graph_inputs,
                                                                  graph_outputs,
                                                                  tf.float32.as_datatype_enum)

# Export frozen graph
with tf.io.gfile.GFile('optimized_graph.pb', 'wb') as f:
    f.write(graph_def.SerializeToString())
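
For context, this answer assumes a SavedModel directory already exists at models/mnist_test. A minimal sketch of producing one from a Keras model follows; the toy model is a placeholder, and the serving_default input key usually follows the first layer's name (flatten_input here), which you can verify via loaded.signatures['serving_default'].structured_input_signature:

import tensorflow as tf

# Placeholder model standing in for a trained MNIST network
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28, 1), name="flatten"),
    tf.keras.layers.Dense(10, activation="softmax"),
])

# Writes models/mnist_test/saved_model.pb plus a variables/ directory,
# which is what tf.saved_model.load('models/mnist_test') above consumes
tf.saved_model.save(model, "models/mnist_test")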

