xgboostRF: xgboost random forests algorithm


View source: R/classification.R

Description

Scalable and Flexible Gradient Boosting. XGBoost is short for “Extreme Gradient Boosting”, where the term “gradient boosting” was proposed in the paper Greedy Function Approximation: A Gradient Boosting Machine by Friedman. XGBoost builds on this original model. This function uses gradient boosted trees for privateEC.
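As an illustration of the underlying technique (not part of privateEC itself), a minimal gradient boosted tree fit with the xgboost package might look like the sketch below. The data here are made up for demonstration; the parameter values mirror this function's defaults (one boosting round, maximum tree depth 4, shrinkage 1).

```r
library(xgboost)

# Toy binary classification data (fabricated for illustration only)
set.seed(42)
x <- matrix(rnorm(100 * 10), nrow = 100, ncol = 10)
y <- as.numeric(x[, 1] + rnorm(100) > 0)

# num.rounds, max.depth, and shrinkage in xgboostRF correspond to
# xgboost's nrounds, max_depth, and eta, respectively
fit <- xgboost(data = x, label = y,
               nrounds = 1, max_depth = 4, eta = 1,
               objective = "binary:logistic",
               nthread = 2, verbose = 0)

pred <- predict(fit, x)  # predicted class probabilities in (0, 1)
```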

Usage

xgboostRF(train.ds = NULL, holdout.ds = NULL, validation.ds = NULL,
  label = "class", cv.folds = NULL, num.threads = 2, num.rounds = c(1),
  max.depth = c(4), shrinkage = c(1), objective = "binary:logistic",
  save.file = NULL, verbose = FALSE)

Arguments

train.ds

A data frame with training data and outcome labels

holdout.ds

A data frame with holdout data and outcome labels

validation.ds

A data frame with validation data and outcome labels

label

A character vector of the outcome variable column name

cv.folds

An integer for the number of cross-validation folds

num.threads

An integer for the number of OpenMP threads (cores)

num.rounds

An integer number of xgboost boosting iterations

max.depth

An integer maximum tree depth

shrinkage

A numeric gradient learning rate between 0 and 1

save.file

A character vector for results filename or NULL to skip

verbose

A flag indicating whether verbose output should be sent to stdout

Value

A list containing:

algo.acc

data frame of results, a row for each update

ggplot.data

melted results data frame for plotting

trn.model

xgboost model

elapsed

total elapsed time

See Also

Other classification: epistasisRank, getImportanceScores, originalThresholdout, privateEC, privateRF, standardRF

Examples

num.samples <- 100
num.variables <- 100
pct.signals <- 0.1
sim.data <- createSimulation(num.variables = num.variables,
                             num.samples = num.samples,
                             sim.type = "mainEffect",
                             pct.train = 1 / 3,
                             pct.holdout = 1 / 3,
                             pct.validation = 1 / 3,
                             verbose = FALSE)
rra.results <- xgboostRF(train.ds = sim.data$train,
                         holdout.ds = sim.data$holdout,
                         validation.ds = sim.data$validation,
                         label = sim.data$label,
                         num.rounds = c(1),
                         max.depth = c(10),
                         is.simulated = TRUE,
                         verbose = FALSE,
                         signal.names = sim.data$signal.names)
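The returned list can then be inspected as sketched below; the field names follow the Value section above, and the comment on reusing the model for prediction is an assumption about typical xgboost usage rather than a documented feature of this function.

```r
head(rra.results$algo.acc)   # data frame of results, one row per update
rra.results$elapsed          # total elapsed time
# rra.results$ggplot.data is a melted data frame intended for plotting
# rra.results$trn.model is the fitted xgboost model and could, in principle,
# be reused with predict() on new data
```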

hexhead/privateEC documentation built on July 20, 2018, 12:30 p.m.