serve_savedmodel: Serve a SavedModel


Description

Serve a TensorFlow SavedModel as a local web API.

Usage

serve_savedmodel(model_dir, host = "127.0.0.1", port = 8089,
  daemonized = FALSE, browse = !daemonized)

Arguments

model_dir

The path to the exported model, as a string.

host

The address used to serve the model, as a string.

port

The port used to serve the model, as a number.

daemonized

Runs the 'httpuv' server daemonized, so that interactive R sessions are not blocked while handling requests. To terminate a daemonized server, call 'httpuv::stopDaemonizedServer()' with the handle returned from this call.

browse

Launch browser with serving landing page?
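The daemonized mode can be sketched as follows. This reuses the bundled MNIST model path from the Examples section below; running it requires the tfdeploy package and a working TensorFlow installation, so treat it as a sketch rather than a guaranteed-runnable snippet:

```r
# Sketch: serve the model in the background so the R session stays
# interactive (assumes tfdeploy and its TensorFlow dependency are installed).
handle <- tfdeploy::serve_savedmodel(
  system.file("models/tensorflow-mnist", package = "tfdeploy"),
  daemonized = TRUE,
  browse = FALSE
)

# ... the session remains free while the model answers requests on port 8089 ...

# Terminate the daemonized server using the returned handle.
httpuv::stopDaemonizedServer(handle)
```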

See Also

export_savedmodel()

Examples

## Not run: 
# serve an existing model over a web interface
tfdeploy::serve_savedmodel(
  system.file("models/tensorflow-mnist", package = "tfdeploy")
)

## End(Not run)

tfdeploy documentation built on June 14, 2019, 5:04 p.m.