describe: describe

Description

Computes basic statistics for numeric and string columns. If no columns are given, this function computes statistics for all numeric and string columns.
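
For instance (a minimal sketch, assuming a SparkDataFrame df with columns "age" and "name"):

## Not run: 
describe(df)          # statistics for every numeric and string column
describe(df, "age")   # statistics for the named column only

## End(Not run)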

Usage

describe(x, col, ...)

## S4 method for signature 'SparkDataFrame,character'
describe(x, col, ...)

## S4 method for signature 'SparkDataFrame,ANY'
describe(x)

Arguments

x

a SparkDataFrame whose columns are to be summarized.

col

the name of a column to summarize, as a string.

...

additional column names to summarize, as strings.

Value

A SparkDataFrame containing the computed statistics (count, mean, stddev, min, and max) for the specified columns.
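
The first column of the result is named "summary" and each summarized input column appears as a string column. A minimal sketch of inspecting the result, assuming a SparkDataFrame df with a numeric column "age":

## Not run: 
stats <- describe(df, "age")
showDF(stats)    # prints the count/mean/stddev/min/max rows for "age"
collect(stats)   # brings the small summary table back as a local data.frame

## End(Not run)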

Note

describe(SparkDataFrame, character) since 1.4.0

describe(SparkDataFrame) since 1.4.0

See Also

See summary for expanded statistics and control over which statistics to compute.
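
For instance (a sketch, assuming a SparkDataFrame df), summary() accepts the statistics to compute as additional string arguments, including approximate percentiles:

## Not run: 
summary(df)                              # count, mean, stddev, quartiles, min, max
summary(df, "min", "25%", "75%", "max")  # only the requested statistics

## End(Not run)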

Other SparkDataFrame functions: SparkDataFrame-class, agg(), alias(), arrange(), as.data.frame(), attach,SparkDataFrame-method, broadcast(), cache(), checkpoint(), coalesce(), collect(), colnames(), coltypes(), createOrReplaceTempView(), crossJoin(), cube(), dapplyCollect(), dapply(), dim(), distinct(), dropDuplicates(), dropna(), drop(), dtypes(), exceptAll(), except(), explain(), filter(), first(), gapplyCollect(), gapply(), getNumPartitions(), group_by(), head(), hint(), histogram(), insertInto(), intersectAll(), intersect(), isLocal(), isStreaming(), join(), limit(), localCheckpoint(), merge(), mutate(), ncol(), nrow(), persist(), printSchema(), randomSplit(), rbind(), rename(), repartitionByRange(), repartition(), rollup(), sample(), saveAsTable(), schema(), selectExpr(), select(), showDF(), show(), storageLevel(), str(), subset(), summary(), take(), toJSON(), unionAll(), unionByName(), union(), unpersist(), withColumn(), withWatermark(), with(), write.df(), write.jdbc(), write.json(), write.orc(), write.parquet(), write.stream(), write.text()

Examples

## Not run: 
sparkR.session()
path <- "path/to/file.json"
df <- read.json(path)
describe(df)
describe(df, "col1")
describe(df, "col1", "col2")

## End(Not run)
