uboost: U-learner implemented via xgboost (boosting)


View source: R/uboost.R

Description

U-learner, as proposed by Künzel, Sekhon, Bickel, and Yu (2017), implemented via xgboost (boosting).
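The U-learner forms the pseudo-outcome U = (Y - m(X)) / (W - p(X)) from nuisance estimates m_hat ≈ E[Y|X] and p_hat ≈ E[W|X], then regresses U on X to estimate the treatment effect. A minimal self-contained sketch of this construction (using lm in place of xgboost, and oracle nuisance values for illustration only — uboost cross-fits these internally):

```r
set.seed(1)
n = 200; p = 5
x = matrix(rnorm(n * p), n, p)
w = rbinom(n, 1, 0.5)
tau = pmax(x[, 1], 0)                          # true treatment effect
y = tau * w + x[, 2] + rnorm(n)

# Oracle nuisances for illustration; uboost cross-fits these with xgboost.
p_hat = rep(0.5, n)                            # E[W|X] = 0.5 by design
m_hat = p_hat * tau + x[, 2]                   # E[Y|X]

cutoff = 0.05
p_hat = pmax(cutoff, pmin(1 - cutoff, p_hat))  # truncate extreme propensities

u = (y - m_hat) / (w - p_hat)                  # U-learner pseudo-outcome
tau_fit = lm(u ~ x)                            # uboost regresses u on x with xgboost instead
tau_hat = fitted(tau_fit)                      # estimated treatment effects
```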

Usage

uboost(
  x,
  w,
  y,
  k_folds = NULL,
  p_hat = NULL,
  m_hat = NULL,
  cutoff = 0.05,
  ntrees_max = 1000,
  num_search_rounds = 10,
  print_every_n = 100,
  early_stopping_rounds = 10,
  nthread = NULL,
  verbose = FALSE
)

Arguments

x

the input features

w

the treatment variable (0 or 1)

y

the observed response (real valued)

k_folds

number of folds used for cross-fitting and cross-validation

p_hat

pre-computed estimates of E[W|X] corresponding to the rows of x. uboost will compute them internally if not provided.

m_hat

pre-computed estimates of E[Y|X] corresponding to the rows of x. uboost will compute them internally if not provided.

cutoff

the threshold used to truncate the propensity estimates away from 0 and 1

ntrees_max

the maximum number of trees to grow for xgboost

num_search_rounds

the number of hyperparameter combinations to sample at random when cross-validating the xgboost trees

print_every_n

the number of boosting iterations (one tree is grown per iteration) between progress printouts

early_stopping_rounds

the number of consecutive rounds without a decrease in test error after which the cross-validation for the optimal number of trees stops early

nthread

the number of threads to use. The default is NULL, which uses all available threads

verbose

boolean; whether to print progress statistics

Examples

## Not run: 
n = 100; p = 10

x = matrix(rnorm(n*p), n, p)
w = rbinom(n, 1, 0.5)
y = pmax(x[,1], 0) * w + x[,2] + pmin(x[,3], 0) + rnorm(n)

uboost_fit = uboost(x, w, y)
uboost_est = predict(uboost_fit, x)

## End(Not run)
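If propensity or outcome estimates are already available (for instance, the known assignment probability in a randomized experiment), they can be supplied through the p_hat and m_hat arguments so that uboost skips the corresponding internal cross-fitting. A hedged sketch continuing the simulated example above:

```r
## Not run: 
# known randomization probability P(W = 1 | X) = 0.5
p_hat = rep(0.5, n)

uboost_fit = uboost(x, w, y, p_hat = p_hat, k_folds = 5)
uboost_est = predict(uboost_fit, x)

## End(Not run)
```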

xnie/rlearner documentation built on April 11, 2021, 12:49 a.m.