Description

Runs a prediction over a saved model file, web API or graph object.
Usage

predict_savedmodel(instances, model, ...)
Arguments

instances
    A list of prediction instances to be passed as input tensors to the service. Even for single predictions, a list with one entry is expected.

model
    The model as a local path, a REST URL or a graph object. A local path can be exported using export_savedmodel(), a REST URL can be created with serve_savedmodel(), and a graph object can be loaded with load_savedmodel().

...
    Additional arguments; see the Implementations section.
See Also

export_savedmodel(), serve_savedmodel(), load_savedmodel()
Examples

## Not run:
# perform prediction based on an existing model
tfdeploy::predict_savedmodel(
  list(rep(9, 784)),
  system.file("models/tensorflow-mnist", package = "tfdeploy")
)
## End(Not run)