| AFTSurvivalRegressionModel-class | S4 class that represents an AFTSurvivalRegressionModel |
| alias | alias |
| ALSModel-class | S4 class that represents an ALSModel |
| approxQuantile | Calculates the approximate quantiles of numerical columns of... |
| arrange | Arrange Rows by Variables |
| as.data.frame | Download data from a SparkDataFrame into an R data.frame |
| attach | Attach SparkDataFrame to R search path |
| avg | avg |
| awaitTermination | awaitTermination |
| between | between |
| BisectingKMeansModel-class | S4 class that represents a BisectingKMeansModel |
| broadcast | broadcast |
| cache | Cache |
| cacheTable | Cache Table |
| cancelJobGroup | Cancel active jobs for the specified group |
| cast | Casts the column to a different data type. |
| checkpoint | checkpoint |
| clearCache | Clear Cache |
| clearJobGroup | Clear current job group ID and its description |
| coalesce | Coalesce |
| collect | Collects all the elements of a SparkDataFrame and coerces... |
| coltypes | coltypes |
| column | S4 class that represents a SparkDataFrame column |
| column_aggregate_functions | Aggregate functions for Column operations |
| column_avro_functions | Avro processing functions for Column operations |
| column_collection_functions | Collection functions for Column operations |
| column_datetime_diff_functions | Date time arithmetic functions for Column operations |
| column_datetime_functions | Date time functions for Column operations |
| columnfunctions | A set of operations working with SparkDataFrame columns |
| column_math_functions | Math functions for Column operations |
| column_misc_functions | Miscellaneous functions for Column operations |
| column_ml_functions | ML functions for Column operations |
| column_nonaggregate_functions | Non-aggregate functions for Column operations |
| columns | Column Names of SparkDataFrame |
| column_string_functions | String functions for Column operations |
| column_window_functions | Window functions for Column operations |
| corr | corr |
| count | Count |
| cov | cov |
| createDataFrame | Create a SparkDataFrame |
| createExternalTable-deprecated | (Deprecated) Create an external table |
| create_lambda | Create o.a.s.sql.expressions.LambdaFunction corresponding to... |
| createOrReplaceTempView | Creates a temporary view using the given name. |
| createTable | Creates a table based on the dataset in a data source |
| crossJoin | CrossJoin |
| crosstab | Computes a pair-wise frequency table of the given columns |
| cube | cube |
| currentDatabase | Returns the current default database |
| dapply | dapply |
| dapplyCollect | dapplyCollect |
| DecisionTreeClassificationModel-class | S4 class that represents a DecisionTreeClassificationModel |
| DecisionTreeRegressionModel-class | S4 class that represents a DecisionTreeRegressionModel |
| describe | describe |
| dim | Returns the dimensions of SparkDataFrame |
| distinct | Distinct |
| drop | drop |
| dropDuplicates | dropDuplicates |
| dropFields | dropFields |
| dropTempTable-deprecated | (Deprecated) Drop Temporary Table |
| dropTempView | Drops the temporary view with the given view name in the... |
| dtypes | DataTypes |
| endsWith | endsWith |
| eq_null_safe | %<=>% |
| except | except |
| exceptAll | exceptAll |
| explain | Explain |
| filter | Filter |
| first | Return the first row of a SparkDataFrame |
| fitted | Get fitted result from a k-means model |
| FMClassificationModel-class | S4 class that represents an FMClassificationModel |
| FMRegressionModel-class | S4 class that represents an FMRegressionModel |
| FPGrowthModel-class | S4 class that represents an FPGrowthModel |
| freqItems | Finding frequent items for columns, possibly with false... |
| gapply | gapply |
| gapplyCollect | gapplyCollect |
| GaussianMixtureModel-class | S4 class that represents a GaussianMixtureModel |
| GBTClassificationModel-class | S4 class that represents a GBTClassificationModel |
| GBTRegressionModel-class | S4 class that represents a GBTRegressionModel |
| GeneralizedLinearRegressionModel-class | S4 class that represents a generalized linear model |
| getLocalProperty | Get a local property set in this thread, or 'NULL' if it is... |
| getNumPartitions | getNumPartitions |
| glm | Generalized Linear Models (R-compliant) |
| groupBy | GroupBy |
| GroupedData | S4 class that represents a GroupedData |
| hashCode | Compute the hashCode of an object |
| head | Head |
| hint | hint |
| histogram | Compute histogram statistics for given column |
| insertInto | insertInto |
| install.spark | Download and Install Apache Spark to a Local Directory |
| intersect | Intersect |
| intersectAll | intersectAll |
| invoke_higher_order_function | Invokes higher order function expression identified by name,... |
| isActive | isActive |
| isLocal | isLocal |
| IsotonicRegressionModel-class | S4 class that represents an IsotonicRegressionModel |
| isStreaming | isStreaming |
| join | Join |
| KMeansModel-class | S4 class that represents a KMeansModel |
| KSTest-class | S4 class that represents a KSTest |
| last | last |
| lastProgress | lastProgress |
| LDAModel-class | S4 class that represents an LDAModel |
| limit | Limit |
| LinearRegressionModel-class | S4 class that represents a LinearRegressionModel |
| LinearSVCModel-class | S4 class that represents a LinearSVCModel |
| listColumns | Returns a list of columns for the given table/view in the... |
| listDatabases | Returns a list of databases available |
| listFunctions | Returns a list of functions registered in the specified... |
| listTables | Returns a list of tables or views in the specified database |
| localCheckpoint | localCheckpoint |
| LogisticRegressionModel-class | S4 class that represents a LogisticRegressionModel |
| match | Match a column with given values. |
| merge | Merges two data frames |
| MultilayerPerceptronClassificationModel-class | S4 class that represents a MultilayerPerceptronClassificationModel |
| mutate | Mutate |
| nafunctions | A set of SparkDataFrame functions working with NA values |
| NaiveBayesModel-class | S4 class that represents a NaiveBayesModel |
| ncol | Returns the number of columns in a SparkDataFrame |
| not | ! |
| nrow | Returns the number of rows in a SparkDataFrame |
| orderBy | Ordering Columns in a WindowSpec |
| otherwise | otherwise |
| over | over |
| partitionBy | partitionBy |
| persist | Persist |
| pivot | Pivot a column of the GroupedData and perform the specified... |
| PowerIterationClustering-class | S4 class that represents a PowerIterationClustering |
| predict | Makes predictions from an MLlib model |
| PrefixSpan-class | S4 class that represents a PrefixSpan |
| print.jobj | Print a JVM object reference. |
| printSchema | Print Schema of a SparkDataFrame |
| print.structField | Print a Spark StructField. |
| print.structType | Print a Spark StructType. |
| queryName | queryName |
| RandomForestClassificationModel-class | S4 class that represents a RandomForestClassificationModel |
| RandomForestRegressionModel-class | S4 class that represents a RandomForestRegressionModel |
| randomSplit | randomSplit |
| rangeBetween | rangeBetween |
| rbind | Union two or more SparkDataFrames |
| read.df | Load a SparkDataFrame |
| read.jdbc | Create a SparkDataFrame representing the database table... |
| read.json | Create a SparkDataFrame from a JSON file. |
| read.ml | Load a fitted MLlib model from the input path. |
| read.orc | Create a SparkDataFrame from an ORC file. |
| read.parquet | Create a SparkDataFrame from a Parquet file. |
| read.stream | Load a streaming SparkDataFrame |
| read.text | Create a SparkDataFrame from a text file. |
| recoverPartitions | Recovers all the partitions in the directory of a table and... |
| refreshByPath | Invalidates and refreshes all the cached data and metadata... |
| refreshTable | Invalidates and refreshes all the cached data and metadata of... |
| registerTempTable-deprecated | (Deprecated) Register Temporary Table |
| rename | rename |
| repartition | Repartition |
| repartitionByRange | Repartition by range |
| rollup | rollup |
| rowsBetween | rowsBetween |
| sample | Sample |
| sampleBy | Returns a stratified sample without replacement |
| saveAsTable | Save the contents of the SparkDataFrame to a data source as a... |
| schema | Get schema object |
| select | Select |
| selectExpr | SelectExpr |
| setCheckpointDir | Set checkpoint directory |
| setCurrentDatabase | Sets the current default database |
| setJobDescription | Set a human readable description of the current job. |
| setJobGroup | Assigns a group ID to all the jobs started by this thread... |
| setLocalProperty | Set a local property that affects jobs submitted from this... |
| setLogLevel | Set new log level |
| show | show |
| showDF | showDF |
| spark.addFile | Add a file or directory to be downloaded with this Spark job... |
| spark.als | Alternating Least Squares (ALS) for Collaborative Filtering |
| spark.bisectingKmeans | Bisecting K-Means Clustering Model |
| SparkDataFrame | S4 class that represents a SparkDataFrame |
| spark.decisionTree | Decision Tree Model for Regression and Classification |
| spark.fmClassifier | Factorization Machines Classification Model |
| spark.fmRegressor | Factorization Machines Regression Model |
| spark.fpGrowth | FP-growth |
| spark.gaussianMixture | Multivariate Gaussian Mixture Model (GMM) |
| spark.gbt | Gradient Boosted Tree Model for Regression and Classification |
| spark.getSparkFiles | Get the absolute path of a file added through spark.addFile. |
| spark.getSparkFilesRootDirectory | Get the root directory that contains files added through spark.addFile |
| spark.glm | Generalized Linear Models |
| spark.isoreg | Isotonic Regression Model |
| spark.kmeans | K-Means Clustering Model |
| spark.kstest | (One-Sample) Kolmogorov-Smirnov Test |
| spark.lapply | Run a function over a list of elements, distributing the... |
| spark.lda | Latent Dirichlet Allocation |
| spark.lm | Linear Regression Model |
| spark.logit | Logistic Regression Model |
| spark.mlp | Multilayer Perceptron Classification Model |
| spark.naiveBayes | Naive Bayes Models |
| spark.powerIterationClustering | PowerIterationClustering |
| spark.prefixSpan | PrefixSpan |
| spark.randomForest | Random Forest Model for Regression and Classification |
| sparkR.callJMethod | Call Java Methods |
| sparkR.callJStatic | Call Static Java Methods |
| sparkR.conf | Get Runtime Config from the current active SparkSession |
| sparkRHive.init-deprecated | (Deprecated) Initialize a new HiveContext |
| sparkR.init-deprecated | (Deprecated) Initialize a new Spark Context |
| sparkR.newJObject | Create Java Objects |
| sparkR.session | Get the existing SparkSession or initialize a new... |
| sparkR.session.stop | Stop the Spark Session and Spark Context |
| sparkRSQL.init-deprecated | (Deprecated) Initialize a new SQLContext |
| sparkR.uiWebUrl | Get the URL of the SparkUI instance for the current active... |
| sparkR.version | Get version of Spark on which this application is running |
| spark.survreg | Accelerated Failure Time (AFT) Survival Regression Model |
| spark.svmLinear | Linear SVM Model |
| sql | SQL Query |
| startsWith | startsWith |
| status | status |
| stopQuery | stopQuery |
| storageLevel | StorageLevel |
| str | Compactly display the structure of a dataset |
| StreamingQuery | S4 class that represents a StreamingQuery |
| structField | structField |
| structType | structType |
| subset | Subset |
| substr | substr |
| summarize | summarize |
| summary | summary |
| tableNames | Table Names |
| tables | Tables |
| tableToDF | Create a SparkDataFrame from a SparkSQL table or view |
| take | Take the first NUM rows of a SparkDataFrame and return the... |
| toJSON | toJSON |
| uncacheTable | Uncache Table |
| union | Return a new SparkDataFrame containing the union of rows |
| unionAll | Return a new SparkDataFrame containing the union of rows. |
| unionByName | Return a new SparkDataFrame containing the union of rows,... |
| unpersist | Unpersist |
| unresolved_named_lambda_var | Create o.a.s.sql.expressions.UnresolvedNamedLambdaVariable,... |
| windowOrderBy | windowOrderBy |
| windowPartitionBy | windowPartitionBy |
| WindowSpec | S4 class that represents a WindowSpec |
| with | Evaluate an R expression in an environment constructed from a... |
| withColumn | WithColumn |
| withField | withField |
| withWatermark | withWatermark |
| write.df | Save the contents of SparkDataFrame to a data source. |
| write.jdbc | Save the content of SparkDataFrame to an external database... |
| write.json | Save the contents of SparkDataFrame as a JSON file |
| write.ml | Saves the MLlib model to the input path |
| write.orc | Save the contents of SparkDataFrame as an ORC file,... |
| write.parquet | Save the contents of SparkDataFrame as a Parquet file,... |
| write.stream | Write the streaming SparkDataFrame to a data source. |
| write.text | Save the content of SparkDataFrame in a text file at the... |
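Most entries above are functions exported by the SparkR package. As a quick orientation, the sketch below strings a few of the indexed functions (sparkR.session, createDataFrame, filter, select, withColumn, groupBy, summarize, collect, spark.glm, predict) into a minimal end-to-end session. It is only a sketch: it assumes a working local Spark installation (see install.spark), and the master setting, app name, derived column name, and commented-out output paths are illustrative placeholders, not values taken from the documentation above.

```r
library(SparkR)

# Start (or reuse) a SparkSession; "local[*]" and the app name are
# illustrative choices for a local test run.
sparkR.session(master = "local[*]", appName = "SparkRIndexExample")

# Create a SparkDataFrame from the built-in 'faithful' R data.frame.
df <- createDataFrame(faithful)

# Basic DataFrame operations from the index: filter, select, withColumn,
# groupBy, summarize, collect.
long_eruptions <- filter(df, df$eruptions > 3)
head(select(long_eruptions, long_eruptions$waiting))

df2 <- withColumn(df, "long_wait", df$waiting >= 70)   # 'long_wait' is a made-up column name
stats <- summarize(groupBy(df2, df2$long_wait),
                   avg_eruptions = avg(df2$eruptions))
collect(stats)   # pulls the small aggregated result back as an R data.frame

# Fit a simple MLlib model (spark.glm), inspect it, and score the data.
model <- spark.glm(df, waiting ~ eruptions, family = "gaussian")
summary(model)
preds <- predict(model, df)
head(select(preds, "waiting", "prediction"))

# Persisting results; the paths below are placeholders.
# write.ml(model, "/tmp/sparkr-glm-model")
# write.parquet(df, "/tmp/faithful-parquet")

sparkR.session.stop()
```

Note that collect and head bring rows back to the R driver process, so they are best reserved for small results such as the aggregated stats above; the other operations stay distributed on the cluster.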