Man pages for SparkR
R Front End for 'Apache Spark'

AFTSurvivalRegressionModel-class    S4 class that represents an AFTSurvivalRegressionModel
alias    alias
ALSModel-class    S4 class that represents an ALSModel
approxQuantile    Calculates the approximate quantiles of numerical columns of...
arrange    Arrange Rows by Variables
as.data.frame    Download data from a SparkDataFrame into an R data.frame
attach    Attach SparkDataFrame to R search path
avg    avg
awaitTermination    awaitTermination
between    between
BisectingKMeansModel-class    S4 class that represents a BisectingKMeansModel
broadcast    broadcast
cache    Cache
cacheTable    Cache Table
cancelJobGroup    Cancel active jobs for the specified group
cast    Casts the column to a different data type.
checkpoint    checkpoint
clearCache    Clear Cache
clearJobGroup    Clear current job group ID and its description
coalesce    Coalesce
collect    Collects all the elements of a SparkDataFrame and coerces...
coltypes    coltypes
column    S4 class that represents a SparkDataFrame column
column_aggregate_functions    Aggregate functions for Column operations
column_collection_functions    Collection functions for Column operations
column_datetime_diff_functions    Date time arithmetic functions for Column operations
column_datetime_functions    Date time functions for Column operations
columnfunctions    A set of operations working with SparkDataFrame columns
column_math_functions    Math functions for Column operations
column_misc_functions    Miscellaneous functions for Column operations
column_nonaggregate_functions    Non-aggregate functions for Column operations
columns    Column Names of SparkDataFrame
column_string_functions    String functions for Column operations
column_window_functions    Window functions for Column operations
corr    corr
count    Count
cov    cov
createDataFrame    Create a SparkDataFrame
createExternalTable-deprecated    (Deprecated) Create an external table
createOrReplaceTempView    Creates a temporary view using the given name.
createTable    Creates a table based on the dataset in a data source
crossJoin    CrossJoin
crosstab    Computes a pair-wise frequency table of the given columns
cube    cube
currentDatabase    Returns the current default database
dapply    dapply
dapplyCollect    dapplyCollect
DecisionTreeClassificationModel-class    S4 class that represents a DecisionTreeClassificationModel
DecisionTreeRegressionModel-class    S4 class that represents a DecisionTreeRegressionModel
describe    describe
dim    Returns the dimensions of SparkDataFrame
distinct    Distinct
drop    drop
dropDuplicates    dropDuplicates
dropTempTable-deprecated    (Deprecated) Drop Temporary Table
dropTempView    Drops the temporary view with the given view name in the...
dtypes    DataTypes
endsWith    endsWith
eq_null_safe    %<=>%
except    except
exceptAll    exceptAll
explain    Explain
filter    Filter
first    Return the first row of a SparkDataFrame
fitted    Get fitted result from a k-means model
FPGrowthModel-class    S4 class that represents an FPGrowthModel
freqItems    Finding frequent items for columns, possibly with false...
gapply    gapply
gapplyCollect    gapplyCollect
GaussianMixtureModel-class    S4 class that represents a GaussianMixtureModel
GBTClassificationModel-class    S4 class that represents a GBTClassificationModel
GBTRegressionModel-class    S4 class that represents a GBTRegressionModel
GeneralizedLinearRegressionModel-class    S4 class that represents a generalized linear model
getLocalProperty    Get a local property set in this thread, or 'NULL' if it is...
getNumPartitions    getNumPartitions
glm    Generalized Linear Models (R-compliant)
groupBy    GroupBy
GroupedData    S4 class that represents a GroupedData
hashCode    Compute the hashCode of an object
head    Head
hint    hint
histogram    Compute histogram statistics for given column
insertInto    insertInto
install.spark    Download and Install Apache Spark to a Local Directory
intersect    Intersect
intersectAll    intersectAll
isActive    isActive
isLocal    isLocal
IsotonicRegressionModel-class    S4 class that represents an IsotonicRegressionModel
isStreaming    isStreaming
join    Join
KMeansModel-class    S4 class that represents a KMeansModel
KSTest-class    S4 class that represents a KSTest
last    last
lastProgress    lastProgress
LDAModel-class    S4 class that represents an LDAModel
limit    Limit
LinearSVCModel-class    S4 class that represents a LinearSVCModel
listColumns    Returns a list of columns for the given table/view in the...
listDatabases    Returns a list of databases available
listFunctions    Returns a list of functions registered in the specified...
listTables    Returns a list of tables or views in the specified database
localCheckpoint    localCheckpoint
LogisticRegressionModel-class    S4 class that represents a LogisticRegressionModel
match    Match a column with given values.
merge    Merges two data frames
MultilayerPerceptronClassificationModel-class    S4 class that represents a...
mutate    Mutate
nafunctions    A set of SparkDataFrame functions working with NA values
NaiveBayesModel-class    S4 class that represents a NaiveBayesModel
ncol    Returns the number of columns in a SparkDataFrame
not    !
nrow    Returns the number of rows in a SparkDataFrame
orderBy    Ordering Columns in a WindowSpec
otherwise    otherwise
over    over
partitionBy    partitionBy
persist    Persist
pivot    Pivot a column of the GroupedData and perform the specified...
predict    Makes predictions from an MLlib model
print.jobj    Print a JVM object reference.
printSchema    Print Schema of a SparkDataFrame
print.structField    Print a Spark StructField.
print.structType    Print a Spark StructType.
queryName    queryName
RandomForestClassificationModel-class    S4 class that represents a RandomForestClassificationModel
RandomForestRegressionModel-class    S4 class that represents a RandomForestRegressionModel
randomSplit    randomSplit
rangeBetween    rangeBetween
rbind    Union two or more SparkDataFrames
read.df    Load a SparkDataFrame
read.jdbc    Create a SparkDataFrame representing the database table...
read.json    Create a SparkDataFrame from a JSON file.
read.ml    Load a fitted MLlib model from the input path.
read.orc    Create a SparkDataFrame from an ORC file.
read.parquet    Create a SparkDataFrame from a Parquet file.
read.stream    Load a streaming SparkDataFrame
read.text    Create a SparkDataFrame from a text file.
recoverPartitions    Recovers all the partitions in the directory of a table and...
refreshByPath    Invalidates and refreshes all the cached data and metadata...
refreshTable    Invalidates and refreshes all the cached data and metadata of...
registerTempTable-deprecated    (Deprecated) Register Temporary Table
rename    rename
repartition    Repartition
repartitionByRange    Repartition by range
rollup    rollup
rowsBetween    rowsBetween
sample    Sample
sampleBy    Returns a stratified sample without replacement
saveAsTable    Save the contents of the SparkDataFrame to a data source as a...
schema    Get schema object
select    Select
selectExpr    SelectExpr
setCheckpointDir    Set checkpoint directory
setCurrentDatabase    Sets the current default database
setJobDescription    Set a human-readable description of the current job.
setJobGroup    Assigns a group ID to all the jobs started by this thread...
setLocalProperty    Set a local property that affects jobs submitted from this...
setLogLevel    Set new log level
show    show
showDF    showDF
spark.addFile    Add a file or directory to be downloaded with this Spark job...
spark.als    Alternating Least Squares (ALS) for Collaborative Filtering
spark.bisectingKmeans    Bisecting K-Means Clustering Model
SparkDataFrame    S4 class that represents a SparkDataFrame
spark.decisionTree    Decision Tree Model for Regression and Classification
spark.fpGrowth    FP-growth
spark.gaussianMixture    Multivariate Gaussian Mixture Model (GMM)
spark.gbt    Gradient Boosted Tree Model for Regression and Classification
spark.getSparkFiles    Get the absolute path of a file added through spark.addFile.
spark.getSparkFilesRootDirectory    Get the root directory that contains files added through...
spark.glm    Generalized Linear Models
spark.isoreg    Isotonic Regression Model
spark.kmeans    K-Means Clustering Model
spark.kstest    (One-Sample) Kolmogorov-Smirnov Test
spark.lapply    Run a function over a list of elements, distributing the...
spark.lda    Latent Dirichlet Allocation
spark.logit    Logistic Regression Model
spark.mlp    Multilayer Perceptron Classification Model
spark.naiveBayes    Naive Bayes Models
spark.randomForest    Random Forest Model for Regression and Classification
sparkR.callJMethod    Call Java Methods
sparkR.callJStatic    Call Static Java Methods
sparkR.conf    Get Runtime Config from the current active SparkSession
sparkRHive.init-deprecated    (Deprecated) Initialize a new HiveContext
sparkR.init-deprecated    (Deprecated) Initialize a new Spark Context
sparkR.newJObject    Create Java Objects
sparkR.session    Get the existing SparkSession or initialize a new...
sparkR.session.stop    Stop the Spark Session and Spark Context
sparkRSQL.init-deprecated    (Deprecated) Initialize a new SQLContext
sparkR.uiWebUrl    Get the URL of the SparkUI instance for the current active...
sparkR.version    Get version of Spark on which this application is running
spark.survreg    Accelerated Failure Time (AFT) Survival Regression Model
spark.svmLinear    Linear SVM Model
sql    SQL Query
startsWith    startsWith
status    status
stopQuery    stopQuery
storageLevel    StorageLevel
str    Compactly display the structure of a dataset
StreamingQuery    S4 class that represents a StreamingQuery
structField    structField
structType    structType
subset    Subset
substr    substr
summarize    summarize
summary    summary
tableNames    Table Names
tables    Tables
tableToDF    Create a SparkDataFrame from a SparkSQL table or view
take    Take the first NUM rows of a SparkDataFrame and return the...
toJSON    toJSON
uncacheTable    Uncache Table
union    Return a new SparkDataFrame containing the union of rows
unionByName    Return a new SparkDataFrame containing the union of rows,...
unpersist    Unpersist
windowOrderBy    windowOrderBy
windowPartitionBy    windowPartitionBy
WindowSpec    S4 class that represents a WindowSpec
with    Evaluate an R expression in an environment constructed from a...
withColumn    WithColumn
withWatermark    withWatermark
write.df    Save the contents of SparkDataFrame to a data source.
write.jdbc    Save the content of SparkDataFrame to an external database...
write.json    Save the contents of SparkDataFrame as a JSON file
write.ml    Saves the MLlib model to the input path
write.orc    Save the contents of SparkDataFrame as an ORC file,...
write.parquet    Save the contents of SparkDataFrame as a Parquet file,...
write.stream    Write the streaming SparkDataFrame to a data source.
write.text    Save the content of SparkDataFrame in a text file at the...
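A minimal SparkR session tying several of the entries above together might look like the following. This is an illustrative sketch only: it assumes a working local Apache Spark installation (e.g. one set up via install.spark()) and the SparkR package on the library path; the appName value is arbitrary.

```r
# Illustrative sketch: requires a local Spark installation and the
# SparkR package; not runnable without them.
library(SparkR)

# Get the existing SparkSession or initialize a new one
sparkR.session(appName = "SparkR-example")

# createDataFrame: build a SparkDataFrame from a local R data.frame
df <- createDataFrame(faithful)

# head / printSchema: inspect the first rows and the schema
head(df)
printSchema(df)

# groupBy + summarize + collect: aggregate on the cluster,
# then bring the result back as a local R data.frame
counts <- collect(summarize(groupBy(df, df$waiting),
                            count = n(df$waiting)))

# Stop the Spark Session and Spark Context when done
sparkR.session.stop()
```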
SparkR documentation built on Sept. 2, 2019, 5:05 p.m.