TensorFlow Tutorial : Serving

TensorFlow Serving is now part of TFX (TensorFlow Extended).
Current link: [TFX – Serving](https://www.tensorflow.org/tfx/guide/serving)

The guide contains several pages:
Serving Models
TensorFlow Serving with Docker
Installation
Serving a TensorFlow Model
Architecture
Advanced model server configuration
Build a TensorFlow ModelServer
Use TensorFlow Serving with Kubernetes
Create a new kind of servable
Create a module that discovers new servable paths
SignatureDefs in SavedModel for TensorFlow Serving

Serving Models

Official link: Serving Models

TensorFlow Serving with Docker

Official links: TensorFlow Serving with Docker, Installation

Installation

Serving a TensorFlow Model

A SavedModel can serve requests in two forms: one accepts Example-format data from the client, which the backend must parse into Tensors before running the model for prediction; the other accepts data passed in directly as Tensors.
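
A rough sketch of the two input styles, using the TensorFlow 1.x Estimator export helpers; the feature name "x" and its shape are only illustrative assumptions:

```python
import tensorflow as tf

# 1) Example-format input: the client sends serialized tf.Example protos,
#    which the server parses into feature Tensors before prediction.
feature_spec = {"x": tf.FixedLenFeature(shape=[4], dtype=tf.float32)}
example_input_fn = tf.estimator.export.build_parsing_serving_input_receiver_fn(
    feature_spec)

# 2) Tensor-format input: the client sends the feature Tensors directly,
#    so no parsing step is needed on the server side.
raw_features = {"x": tf.placeholder(dtype=tf.float32, shape=[None, 4], name="x")}
tensor_input_fn = tf.estimator.export.build_raw_serving_input_receiver_fn(
    raw_features)
```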

Architecture

Advanced model server configuration

Self-Summary

For ordinary checkpointing, a model can be saved and restored directly with the tf.train.Saver class.
But to use the Serving service, the model must be exported as a SavedModel.
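
For reference, a minimal checkpointing sketch with tf.train.Saver (TF 1.x); the variable and checkpoint path are just placeholders:

```python
import tensorflow as tf

w = tf.Variable(tf.zeros([4, 2]), name="w")
saver = tf.train.Saver()

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    saver.save(sess, "/tmp/model.ckpt")     # write a checkpoint
    saver.restore(sess, "/tmp/model.ckpt")  # load it back
```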

In simple cases, tf.saved_model.simple_save is enough.
In more complex cases, tf.saved_model.builder.SavedModelBuilder is needed.
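
A hedged sketch of both export paths in TensorFlow 1.x; the tensor names, export directories, and signature key are assumptions for illustration:

```python
import tensorflow as tf

x = tf.placeholder(tf.float32, [None, 4], name="x")
y = tf.layers.dense(x, 2, name="y")

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())

    # Simple case: one call writes a SavedModel with a default serving signature.
    tf.saved_model.simple_save(
        sess, "/tmp/simple_model", inputs={"x": x}, outputs={"y": y})

    # Complex case: SavedModelBuilder lets you attach custom signatures and tags.
    builder = tf.saved_model.builder.SavedModelBuilder("/tmp/built_model")
    signature = tf.saved_model.signature_def_utils.predict_signature_def(
        inputs={"x": x}, outputs={"y": y})
    builder.add_meta_graph_and_variables(
        sess,
        tags=[tf.saved_model.tag_constants.SERVING],
        signature_def_map={"predict": signature})
    builder.save()
```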

An Estimator can also be exported to the SavedModel format.
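
A minimal sketch of exporting an Estimator (TF 1.x); the feature column, dummy training data, and export directory are assumptions, not from the original post:

```python
import numpy as np
import tensorflow as tf

feature_columns = [tf.feature_column.numeric_column("x", shape=[4])]
estimator = tf.estimator.LinearClassifier(feature_columns=feature_columns)

# A checkpoint must exist before exporting, so train one step on dummy data.
train_input_fn = tf.estimator.inputs.numpy_input_fn(
    {"x": np.random.rand(8, 4).astype(np.float32)},
    y=np.zeros(8, dtype=np.int32), shuffle=True)
estimator.train(train_input_fn, steps=1)

# The serving input receiver decides whether the exported model accepts
# serialized tf.Example protos (as here) or raw Tensors.
feature_spec = tf.feature_column.make_parse_example_spec(feature_columns)
serving_input_fn = tf.estimator.export.build_parsing_serving_input_receiver_fn(
    feature_spec)

# Writes a timestamped SavedModel directory under /tmp/estimator_export.
estimator.export_savedmodel("/tmp/estimator_export", serving_input_fn)
```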
