Question
Exactly how should Python models be exported for use in C++?
I'm trying to do something similar to this tutorial: https://www.tensorflow.org/versions/r0.8/tutorials/image_recognition/index.html
I'm trying to import my own TF model into the C++ API instead of the Inception one. I adjusted the input size and the paths, but strange errors keep popping up. I spent all day reading Stack Overflow and other forums, but to no avail.
I've tried two methods for exporting the graph.
Method 1: meta graph:
# ...loading inputs, setting up the model, etc....
sess = tf.InteractiveSession()
sess.run(tf.initialize_all_variables())
for i in range(num_steps):
    x_batch, y_batch = batch(50)
    if i % 10 == 0:
        train_accuracy = accuracy.eval(feed_dict={
            x: x_batch, y_: y_batch, keep_prob: 1.0})
        print("step %d, training accuracy %g" % (i, train_accuracy))
    train_step.run(feed_dict={x: x_batch, y_: y_batch, keep_prob: 0.5})

print("test accuracy %g" % accuracy.eval(feed_dict={
    x: features_test, y_: labels_test, keep_prob: 1.0}))

saver = tf.train.Saver(tf.all_variables())
checkpoint = '/home/sander/tensorflow/tensorflow/examples/cat_face/data/model.ckpt'
saver.save(sess, checkpoint)
tf.train.export_meta_graph(
    filename='/home/sander/tensorflow/tensorflow/examples/cat_face/data/cat_graph.pb',
    meta_info_def=None,
    graph_def=sess.graph_def,
    saver_def=saver.restore(sess, checkpoint),
    collection_list=None, as_text=False)
Method 1 yields the following error when trying to run the program:
[libprotobuf ERROR
google/protobuf/src/google/protobuf/wire_format_lite.cc:532] String field
'tensorflow.NodeDef.op' contains invalid UTF-8 data when parsing a protocol
buffer. Use the 'bytes' type if you intend to send raw bytes.
E tensorflow/examples/cat_face/main.cc:281] Not found: Failed to load
compute graph at 'tensorflow/examples/cat_face/data/cat_graph.pb'
I also tried another method of exporting the graph:
Method 2: write_graph:
tf.train.write_graph(sess.graph_def,
                     '/home/sander/tensorflow/tensorflow/examples/cat_face/data/',
                     'cat_graph.pb', as_text=False)
This version actually seems to load something, but I get an error about variables not being initialized:
Running model failed: Failed precondition: Attempting to use uninitialized
value weight1
[[Node: weight1/read = Identity[T=DT_FLOAT, _class=["loc:@weight1"],
_device="/job:localhost/replica:0/task:0/cpu:0"](weight1)]]
Recommended Answer
First, you need to write the graph definition to a file using the following command:
with tf.Session() as sess:
    # Build network here
    tf.train.write_graph(sess.graph.as_graph_def(), "C:/output", "mymodel.pb")
Then, save your model using a saver:
saver = tf.train.Saver(tf.global_variables())
saver.save(sess, "C:/output/mymodel.ckpt")
Then you will have two files in your output directory: mymodel.ckpt and mymodel.pb.
Download freeze_graph.py from the TensorFlow repository and run the following command in C:\output. Change the output node name if it is different for you.
python freeze_graph.py --input_graph mymodel.pb --input_checkpoint mymodel.ckpt --output_node_names softmax/Reshape_1 --output_graph mymodelforc.pb
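Alternatively, if you prefer to freeze the graph from inside your training script rather than calling the freeze_graph.py tool, here is a minimal sketch using the TF 1.x graph_util API. It assumes sess still holds the trained variables and that softmax/Reshape_1 is your output node, as in the command above:

import tensorflow as tf

# Fold the checkpointed variable values into constants so the graph is self-contained.
frozen_graph_def = tf.graph_util.convert_variables_to_constants(
    sess, sess.graph.as_graph_def(), ["softmax/Reshape_1"])

# Write the frozen graph; this file plays the same role as mymodelforc.pb above.
with tf.gfile.GFile("C:/output/mymodelforc.pb", "wb") as f:
    f.write(frozen_graph_def.SerializeToString())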
You can use mymodelforc.pb directly from C++.
You can use the following C++ code to load the proto file:
#include "tensorflow/core/public/session.h"
#include "tensorflow/core/platform/env.h"
#include "tensorflow/cc/ops/image_ops.h"
Session* session;
NewSession(SessionOptions(), &session);
GraphDef graph_def;
ReadBinaryProto(Env::Default(), "C:\output\mymodelforc.pb", &graph_def);
session->Create(graph_def);
Now you can use the session for inference.
You can run inference as follows:
// Same dimension and type as input of your network
tensorflow::Tensor input_tensor(tensorflow::DT_FLOAT, tensorflow::TensorShape({ 1, height, width, channel }));
std::vector<tensorflow::Tensor> finalOutput;
// Fill input tensor with your input data
std::string InputName = "input"; // Your input placeholder's name
std::string OutputName = "softmax/Reshape_1"; // Your output placeholder's name
session->Run({ { InputName, input_tensor } }, { OutputName }, {}, &finalOutput);
// finalOutput will contain the inference output that you search for
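Before wiring up the C++ side, it can also help to sanity-check the frozen graph from Python. This is a sketch under the same assumptions (input placeholder named input, output node softmax/Reshape_1, TF 1.x API); the 224x224x3 shape is only a placeholder, replace it with your network's real input shape:

import numpy as np
import tensorflow as tf

# Load the frozen graph back in.
with tf.gfile.GFile("C:/output/mymodelforc.pb", "rb") as f:
    graph_def = tf.GraphDef()
    graph_def.ParseFromString(f.read())

with tf.Graph().as_default() as graph:
    tf.import_graph_def(graph_def, name="")

with tf.Session(graph=graph) as sess:
    x = graph.get_tensor_by_name("input:0")               # input placeholder
    y = graph.get_tensor_by_name("softmax/Reshape_1:0")   # output node
    dummy = np.zeros((1, 224, 224, 3), dtype=np.float32)  # placeholder shape, adjust to your model
    print(sess.run(y, feed_dict={x: dummy}))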