TensorFlow Serving

1. Install TensorFlow Serving

1.1 Make sure TensorFlow is already installed and runnable in the current environment

Download the source code from GitHub:

git clone --recurse-submodules https://github.com/tensorflow/serving

Go into the tensorflow directory under serving, run ./configure, and complete the installation steps (finish all the steps in section 2 "Troubleshooting" before running the installation steps).

1.2 Build the example code

bazel build tensorflow_serving/example/...

1.3 Run the MNIST example to export the model to /tmp/mnist_export; a subdirectory named /tmp/mnist_export/00000001 will be created there according to export_version

rm -rf /tmp/mnist_export/    (not needed on the first run, since the directory does not exist yet)
 
bazel-bin/tensorflow_serving/example/mnist_export --training_iteration=10000 --export_version=1 /tmp/mnist_export
 
Training model...
('Extracting', '/tmp/train-images-idx3-ubyte.gz')
('Extracting', '/tmp/train-labels-idx1-ubyte.gz')
('Extracting', '/tmp/t10k-images-idx3-ubyte.gz')
('Extracting', '/tmp/t10k-labels-idx1-ubyte.gz')
training accuracy 0.9219
Done training!
Exporting trained model to /tmp/mnist_export
Done exporting!

1.4 Run the inference program to start the service on port 9000, pointing it at the directory exported in the previous step

bazel-bin/tensorflow_serving/example/mnist_inference --port=9000 /tmp/mnist_export/00000001
I tensorflow_serving/session_bundle/session_bundle.cc:130] Attempting to load a SessionBundle from: /tmp/mnist_export/00000001
I tensorflow_serving/session_bundle/session_bundle.cc:107] Running restore op for SessionBundle
I tensorflow_serving/session_bundle/session_bundle.cc:178] Done loading SessionBundle
I tensorflow_serving/example/mnist_inference.cc:163] Running...

1.5 Run the test client

bazel-bin/tensorflow_serving/example/mnist_client --num_tests=1000 --server=localhost:9000
 
('Extracting', '/tmp/train-images-idx3-ubyte.gz')
('Extracting', '/tmp/train-labels-idx1-ubyte.gz')
('Extracting', '/tmp/t10k-images-idx3-ubyte.gz')
('Extracting', '/tmp/t10k-labels-idx1-ubyte.gz')
........................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................
Inference error rate: 9.2%

2. Troubleshooting

no such package '@boringssl_git//': Error cloning repository:https://boringssl.googlesource.com/boringssl: cannot open git-upload-pack and referenced by '//external:libssl'.

Because the GFW blocks many Google hosts, the related content cannot be downloaded. Modify the corresponding repository entry in the tensorflow/tensorflow/workspace.bzl file under the serving directory:

git_repository(
    name = "boringssl_git",
    # the remote should also point at a boringssl mirror that is reachable
    # from behind the GFW (see the issue linked below)
    #commit = "436432d849b83ab90f18773e4ae1c7a8f148f48d",
    commit = "db0729054d5964feab9e60089ba2d06a181e78b1",
    init_submodules = True,
)

https://github.com/tensorflow/serving/issues/6

Error when running mnist_client:

Traceback (most recent call last):
File "/root/tensorflow-serving/bazel-bin/tensorflow_serving/example/mnist_client.runfiles/__main__/tensorflow_serving/example/mnist_client.py", line 34, in <module>
from grpc.beta import implementations
ImportError: No module named grpc.beta

Install the grpcio module with pip:

pip install grpcio

https://github.com/grpc/grpc/tree/master/src/python/grpcio
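
Once grpcio is installed, the client talks to the inference server over gRPC roughly as sketched below. This is a minimal sketch: insecure_channel is the real grpc.beta API, but the generated module, stub factory, and message names are assumptions based on the example's mnist_inference proto.

from grpc.beta import implementations
# generated from mnist_inference.proto; module and stub names are assumptions
from tensorflow_serving.example import mnist_inference_pb2

channel = implementations.insecure_channel('localhost', 9000)
stub = mnist_inference_pb2.beta_create_MnistService_stub(channel)

image = [0.0] * 784                      # placeholder pixel values for one 28x28 image
request = mnist_inference_pb2.MnistRequest()
request.image_data.extend(image)         # same image_data field the C++ server reads
result = stub.Classify(request, 10.0)    # 10-second timeout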

Fix for the error about the missing manifest_pb2.py during export:

First build the example directory under serving, which produces

bazel-bin/tensorflow_serving/example/mnist_export.runfiles/org_tensorflow/tensorflow/contrib/session_bundle/manifest_pb2.py

Then copy it into the lib directory of the Python installation, e.g. /usr/lib/python2.7/site-packages/

If the same error is still reported,

edit the exporter.py file in the /usr/lib/python2.7/site-packages/tensorflow/contrib/session_bundle/ directory:

Delete: from tensorflow.contrib.session_bundle import manifest_pb2

Add: import manifest_pb2
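
After the change, the import section of exporter.py looks roughly like this (a sketch; keeping the original import as a try/except fallback is an optional variation of the edit described above):

# exporter.py import section after the change
try:
    from tensorflow.contrib.session_bundle import manifest_pb2
except ImportError:
    import manifest_pb2  # picked up from site-packages / PYTHONPATH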

The idea behind the fix: manifest_pb2.py is simply not on the PYTHONPATH, so any setup that lets the interpreter find manifest_pb2.py will work.

manifest_pb2.py is the protobuf-generated file for tensorflow/contrib/session_bundle/manifest.proto; it can also be generated manually if the tooling is available.

3. References

TensorFlow Serving on GitHub: https://github.com/tensorflow/serving

https://github.com/tensorflow/serving/blob/master/tensorflow_serving/g3doc/setup.md

Bazel: http://www.bazel.io/ (requires circumventing the GFW to access)

4. Serving Framework


4.1 Train

The process of training the model

4.2 Exporter

Responsible for exporting the trained model

4.3 Saver

Responsible for storage operations, e.g. persisting objects to disk

4.4 Server

Provides the gRPC server; assembles the request, invokes the Module, and sends the result back to the client as the response

4.5 ModuleManager

Responsible for loading the trained model

4.6 Scheduler

Responsible for scheduling requests, e.g. BatchScheduler (buffers a batch of data before handing it to the Service); see the sketch after this list

4.7 Client

Responsible for sending the Request and receiving the Response
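
As a toy illustration of the Scheduler/BatchScheduler idea from 4.6 (illustrative only, not the TensorFlow Serving implementation; all names below are made up):

class ToyBatchScheduler(object):
    """Buffers incoming requests and hands them to the service in batches."""

    def __init__(self, service, batch_size=32):
        self.service = service          # callable that handles a list of requests
        self.batch_size = batch_size
        self.buffer = []

    def submit(self, request):
        self.buffer.append(request)
        if len(self.buffer) >= self.batch_size:
            self.flush()

    def flush(self):
        if self.buffer:
            self.service(self.buffer)   # one call for the whole batch
            self.buffer = []

# usage: with batch_size=2 the second submit triggers a batched service call
scheduler = ToyBatchScheduler(service=lambda batch: None, batch_size=2)
scheduler.submit("request-1")
scheduler.submit("request-2")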

5. How to Write a Serving Program

5.1 Export the model

After the model is trained, it needs to be exported.

1) Decide on the signature type: (classification_signature, regression_signature, generic_signature)

classification_signature: input, classes, scores

regression_signature: input, output

generic_signature: map<string, tensor_name>

The signature specifies the tensor_name of the inputs and outputs; each tensor_name should correspond to a tensor in the graph.

For example, MNIST is a classification model whose training input is x and whose output is y, so at export time use:

signature = exporter.classification_signature(input_tensor=x, scores_tensor=y)

5.2 Decide on the export path

Exporting the model requires a writable path; this path is later used when the inference program loads the model.
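
Putting 5.1 and 5.2 together, the export step in the MNIST example looks roughly like this. This is a sketch based on the old tensorflow.contrib.session_bundle exporter API; sess, x, y, export_path and export_version are assumed to come from the training code.

import tensorflow as tf
from tensorflow.contrib.session_bundle import exporter

saver = tf.train.Saver(sharded=True)
model_exporter = exporter.Exporter(saver)
signature = exporter.classification_signature(input_tensor=x, scores_tensor=y)
model_exporter.init(sess.graph.as_graph_def(),
                    default_graph_signature=signature)
# export_path is e.g. /tmp/mnist_export; the version number becomes the
# 00000001 subdirectory that the inference program reads
model_exporter.export(export_path, tf.constant(export_version), sess)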

5.3 Write the inference program

The main inference flow:

1) Obtain a SessionBundle

std::unique_ptr<SessionBundleFactory> bundle_factory;
TF_QCHECK_OK(
    SessionBundleFactory::Create(session_bundle_config, &bundle_factory));
std::unique_ptr<SessionBundle> bundle(new SessionBundle);
TF_QCHECK_OK(bundle_factory->CreateSessionBundle(bundle_path, &bundle));

2) Prepare the input and output tensors

Tensor input(tensorflow::DT_FLOAT, {1, kImageDataSize});
std::copy_n(request->image_data().begin(), kImageDataSize,
            input.flat<float>().data());
std::vector<Tensor> outputs;

3) Pass the input and output tensor names to the session_bundle via the signature and run the session

const tensorflow::Status status = bundle_->session->Run(
    {{signature_.input().tensor_name(), input}},
    {signature_.scores().tensor_name()}, {}, &outputs);
 