View source: R/databricks_execute.R
databricks_execute — R Documentation
This function sends commands to an execution context on an existing
Databricks cluster via the REST API. It requires a context_id from
create_execution_context. Commands must be written in the
language of the execution context: 'r', 'python', 'scala', or 'sql'.
The function attempts to return a data.frame; if execution has not finished,
it returns the execution status instead. If your command does not return a
data.frame, the output may vary considerably, or the call may fail.
databricks_execute(command, context, verbose = FALSE, ...)
command: A string containing commands for remote execution on Databricks.

context: The list generated by create_execution_context.

verbose: If TRUE, the API response is printed to the console. Defaults to FALSE.

...: Additional options to be passed to
The API endpoint for executing a command is '1.2/commands/execute'. For all details on API calls, please see the official documentation at https://docs.databricks.com/dev-tools/api/latest/.
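As a sketch of what the package does under the hood, the equivalent raw REST call might look like the following. This is an illustration only: the token, cluster ID, and context ID are placeholders, and the exact parameter names and request shape should be confirmed against the official API documentation linked above.

```shell
# Hypothetical raw call to the 1.2/commands/execute endpoint.
# $DATABRICKS_TOKEN, the cluster ID, and <your-context-id> are placeholders.
curl -s -X POST "https://eastus2.azuredatabricks.net/api/1.2/commands/execute" \
  -H "Authorization: Bearer $DATABRICKS_TOKEN" \
  -d language=r \
  -d clusterId=1017-337483-jars232 \
  -d contextId=<your-context-id> \
  -d command='iris[1, ]'
```

This fragment requires a live workspace and valid credentials, so it cannot be run as-is.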
A list with two components:
response - The full API response.
data - The data as a data.frame.
## Using netrc
context <- create_execution_context(
  workspace = "https://eastus2.azuredatabricks.net",
  language = "r",
  cluster_id = "1017-337483-jars232"
)

## Use the context to execute a command on Databricks
command <- "iris[1, ]"
result <- databricks_execute(command, context)

## Access the data.frame
result$data
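Because the function returns the execution status rather than data when the command has not finished, it can help to check the returned list before using it. A minimal sketch, using only the documented response and data components (inspect the actual response structure in your session):

```r
## Sketch: handle both a finished and an unfinished execution.
result <- databricks_execute(command, context)

if (is.data.frame(result$data)) {
  ## Execution finished and produced tabular output.
  head(result$data)
} else {
  ## Execution not finished (or non-tabular output):
  ## inspect the full API response for the status.
  str(result$response)
}
```

This requires a live cluster and execution context, so it is not runnable standalone.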