utree    R Documentation

Description

utree implements recursive partitioning for uplift modeling.
Usage

## S3 method for class 'formula'
utree(formula, data, na.action, classLevel = NULL, treatLevel = NULL,
      control = utree_control(...), ...)

## S3 method for class 'utree'
print(x, ...)

## S3 method for class 'utree'
nodeprune(x, ...)
Arguments

formula
    A model formula of the form y ~ x1 + ... + xn + trt(), where the
    left-hand side corresponds to the observed response, the right-hand
    side corresponds to the predictors, and 'trt' is the special
    expression used to mark the treatment term (a brief sketch follows
    this argument list). At the moment,

data
    A data frame in which to interpret the variables named in the formula.

na.action
    A missing-data filter function.

classLevel
    A character string for the class of interest. Defaults to the last
    level of the factor.

treatLevel
    A character string for the treatment level of interest. Defaults to
    the last level of the treatment factor.

control
    A list with control parameters; see utree_control.

...
    Arguments passed to utree_control.

x
    An object of class "utree".
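The sketch below illustrates the formula interface described above by writing a formula by hand; the column names y, x1, x2, and treat, as well as the data frame mydata, are hypothetical, and create_uplift_formula (used in the Examples section) builds such a formula programmatically.

# Hypothetical column names; trt() marks the treatment term
form <- y ~ x1 + x2 + trt(treat)
# fit <- utree(form, data = mydata)  # 'mydata' is a hypothetical data frame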
Details

Roughly, the algorithm works as follows (a schematic sketch of the split-selection step follows this list):

1. For each terminal node of the tree, test the global null hypothesis of no interaction effect between the treatment indicator and any of the covariates. Stop if this hypothesis cannot be rejected; otherwise, select the input variable with the strongest interaction effect. The interaction effect is measured by a p-value corresponding to an asymptotic or permutation test (Strasser and Weber, 1999) for the partial null hypothesis of independence between each covariate and a transformed response. Specifically, the response is transformed so that the impact of the covariate on the response has a causal interpretation for the treatment effect (see details in Guelman et al., 2015).

2. Implement a binary split in the selected input variable.

3. Recursively repeat the two steps above.
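The following is a minimal, hypothetical sketch of the split-selection idea, not utree's internal code: utree relies on the conditional inference (asymptotic or permutation) tests of Strasser and Weber (1999) applied to a transformed response, whereas this illustration substitutes a simple likelihood-ratio test of the treatment-by-covariate interaction for a binary response.

# Assumed inputs (all hypothetical): binary response y, treatment factor trt,
# and a data frame X of candidate covariates at the current node.
interaction_pvalue <- function(y, x, trt) {
  fit0 <- glm(y ~ x + trt, family = binomial)        # main effects only
  fit1 <- glm(y ~ x * trt, family = binomial)        # adds the interaction
  anova(fit0, fit1, test = "Chisq")[2, "Pr(>Chi)"]   # p-value for the interaction
}

select_split_variable <- function(y, X, trt, alpha = 0.05) {
  pvals <- vapply(X, interaction_pvalue, numeric(1), y = y, trt = trt)
  if (min(pvals) > alpha) return(NULL)   # crude stand-in for the global stopping test
  names(pvals)[which.min(pvals)]         # covariate with the strongest interaction
}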
Function nodeprune is not yet implemented for utree objects.
Value

An object of class "utree".
Author(s)

Leo Guelman <leo.guelman@gmail.com>
References

Guelman, L., Guillen, M. and Perez-Marin, A. M. (2015). "A decision support framework to implement optimal personalized marketing interventions." Decision Support Systems, 72: 24–32.

Hothorn, T., Hornik, K. and Zeileis, A. (2006). "Unbiased recursive partitioning: A conditional inference framework." Journal of Computational and Graphical Statistics, 15(3): 651–674.

Rzepakowski, P. and Jaroszewicz, S. (2011). "Decision trees for uplift modeling with single and multiple treatments." Knowledge and Information Systems, 32(2): 303–327.

Strasser, H. and Weber, C. (1999). "On the asymptotic theory of permutation statistics." Mathematical Methods of Statistics, 8: 220–250.

Su, X., Tsai, C.-L., Wang, H., Nickerson, D. M. and Li, B. (2009). "Subgroup Analysis via Recursive Partitioning." Journal of Machine Learning Research, 10: 141–158.
See Also

plot.utree
Examples

set.seed(1)
df <- sim_uplift(n = 1000, p = 50, response = "binary")
form <- create_uplift_formula(x = names(df)[-c(1:3)], y = "y", trt = "T")
fit <- utree(form, data = df, maxdepth = 3)
fit
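The fitted tree can also be visualized with the plot method listed in See Also, for example:

plot(fit)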