View source: R/spark_compile.R
spark_compilation_spec: Define a Spark Compilation Specification
Description

For use with compile_package_jars(). The Spark compilation specification is
used when compiling Spark extension Java Archives (JARs), and defines which
versions of Spark, as well as which versions of Scala, should be used for
compilation.
Usage

spark_compilation_spec(
spark_version = NULL,
spark_home = NULL,
scalac_path = NULL,
scala_filter = NULL,
jar_name = NULL,
jar_path = NULL,
jar_dep = NULL,
embedded_srcs = "embedded_sources.R"
)
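For instance, a specification that pins a single Spark version might look
like the following sketch. The compiler path, jar tool path, and jar name
are hypothetical placeholders for a local toolchain, not values from the
sparklyr sources.

library(sparklyr)

# A sketch of a custom specification. All paths below are placeholders
# and should point at a locally installed toolchain.
spec <- spark_compilation_spec(
  spark_version = "2.4.3",
  scalac_path   = "/usr/local/bin/scalac",   # assumed Scala 2.11 compiler
  jar_name      = "myextension-2.4-2.11.jar",
  jar_path      = "/usr/bin/jar"
)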
Arguments

spark_version | The Spark version to build against. This can be left unset if the path to a suitable Spark home is supplied.
spark_home | The path to a Spark home installation. This can be left unset if spark_version is supplied; in such a case, the Spark home will be inferred from the Spark version.
scalac_path | The path to the scalac compiler to be used during compilation of your Spark extension. Note that the version of scalac selected should match the version of Scala used by the version of Spark you are compiling against.
scala_filter | An optional R function that can be used to filter which Scala source files are used during compilation. This can be useful if you have auxiliary files that should only be included with certain versions of Spark (see the sketch after this table).
jar_name | The name to be assigned to the generated jar.
jar_path | The path to the jar tool to be used during compilation of your Spark extension.
jar_dep | An optional list of additional jar dependencies.
embedded_srcs | Embedded source file(s) under inst/ to be included in the root of the jar file.
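As a sketch of the scala_filter argument: the function below assumes the
filter receives, and must return, a character vector of Scala source file
paths; the excluded file name is hypothetical.

# Hypothetical filter: drop an auxiliary source file that only compiles
# against newer Spark releases.
drop_streaming_sources <- function(scala_files) {
  scala_files[!grepl("streaming-utils\\.scala$", scala_files)]
}

spec <- spark_compilation_spec(
  spark_version = "2.0.0",
  scala_filter  = drop_streaming_sources
)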
Details

Most Spark extensions won't need to define their own compilation
specification, and can instead rely on the default behavior of
compile_package_jars().
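For a package that does need extra control, one approach is to append a
custom specification to sparklyr's defaults. This sketch assumes
spark_default_compilation_spec() and the spec argument of
compile_package_jars(), both documented in sparklyr.

library(sparklyr)

# Compile against the default set of Spark/Scala versions, plus one
# additional custom specification.
compile_package_jars(
  spec = c(
    spark_default_compilation_spec(),
    list(spark_compilation_spec(spark_version = "2.4.3"))
  )
)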