spark_install: Download and install various versions of Spark

View source: R/install_spark.R

Download and install various versions of Spark

Description

Install versions of Spark for use with local Spark connections (i.e., spark_connect(master = "local")).
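
For example, a typical local workflow installs a release and then connects to it; the version number below is illustrative and should be replaced with one listed by spark_available_versions().

library(sparklyr)

# Install an illustrative Spark version
spark_install(version = "3.5")

# Connect to the locally installed Spark and disconnect when done
sc <- spark_connect(master = "local", version = "3.5")
spark_disconnect(sc)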

Usage

spark_install(
  version = NULL,
  hadoop_version = NULL,
  reset = TRUE,
  logging = "INFO",
  verbose = interactive()
)

spark_uninstall(version, hadoop_version)

spark_install_dir()

spark_install_tar(tarfile)

spark_installed_versions()

spark_available_versions(
  show_hadoop = FALSE,
  show_minor = FALSE,
  show_future = FALSE
)
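
The listing and inspection helpers can be combined to choose a version before installing; a minimal sketch:

# Spark versions available for download (optionally with Hadoop builds and minor releases)
spark_available_versions()
spark_available_versions(show_hadoop = TRUE, show_minor = TRUE)

# Versions already installed locally, and the directory that holds them
spark_installed_versions()
spark_install_dir()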

Arguments

version

Version of Spark to install. See spark_available_versions() for a list of supported versions.

hadoop_version

Version of Hadoop to install. See spark_available_versions() for a list of supported versions.

reset

Whether to attempt to reset settings to their defaults.

logging

Logging level used to configure the install. Supported options: "WARN", "INFO".

verbose

Whether to report progress information as Spark is downloaded and installed.

tarfile

Path to a TAR file conforming to the pattern spark-###-bin-(hadoop)?###, where the ### placeholders reference the Spark and Hadoop versions, respectively. See the example following this argument list.

show_hadoop

Show Hadoop distributions?

show_minor

Show minor Spark versions?

show_future

Should future versions that have not yet been released be shown?
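
For machines without direct internet access, a previously downloaded distribution can be installed from a local archive with spark_install_tar(); the file path below is hypothetical, but the file name must follow the pattern described for the tarfile argument.

# Install from a local distribution archive (path is illustrative)
spark_install_tar("~/Downloads/spark-3.5.1-bin-hadoop3.tgz")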

Value

List with information about the installed version.
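
spark_install() is usually called for its side effect, but the returned list can be captured and inspected; the exact fields depend on the sparklyr version, so this sketch simply prints the structure.

# Capture and inspect installation details (fields vary by sparklyr version)
info <- spark_install(version = "3.5", verbose = TRUE)
str(info)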

