Description Usage Arguments Details Value References See Also Examples
Model-based recursive partitioning based on (generalized) linear models with some local (i.e., leaf-specific) and some global (i.e., constant throughout the tree) regression coefficients.
palmtree(formula, data, weights = NULL, family = NULL,
  lmstart = NULL, abstol = 0.001, maxit = 100,
  dfsplit = TRUE, verbose = FALSE, plot = FALSE, ...)
formula 
formula specifying the response variable and a three-part right-hand side describing the local (i.e., leaf-specific) regressors, the global regressors (i.e., with constant coefficients throughout the tree), and the partitioning variables, respectively. For details see below. 
data 
data.frame to be used for estimating the model tree. 
weights 
numeric. An optional numeric vector of weights. (Note that this is passed with standard evaluation, i.e., it is not enough to pass the name of a column in data.) 
family 
either NULL (so that lm/lmtree is used) or a family specification to be passed to glm/glmtree. See glm for details. 
lmstart 
numeric. A vector of length nrow(data), to be used as an offset in estimation of the first tree. 
abstol 
numeric. The convergence criterion used for estimation of the model. When the difference in log-likelihoods of the model from two consecutive iterations is smaller than abstol, estimation of the model tree stops. 
maxit 
numeric. The maximum number of iterations to be performed in estimation of the model tree. 
dfsplit 
logical or numeric. as.integer(dfsplit) is the degrees of freedom per selected split employed when extracting the log-likelihood. 
verbose 
Should the log-likelihood value of the estimated model be printed for every iteration of the estimation? 
plot 
Should the tree be plotted at every iteration of the estimation? Note that selecting this option slows down execution of the function. 
... 
Additional arguments to be passed to lmtree or glmtree. 
Partially additive (generalized) linear model (PALM) trees learn a tree where each terminal node is associated with different regression coefficients while adjusting for additional global regression effects. This allows for detection of subgroup-specific coefficients with respect to selected covariates, while keeping the remaining regression coefficients constant throughout the tree. The estimation algorithm iterates between (1) estimation of the tree given an offset of the global effects, and (2) estimation of the global regression effects given the tree structure.
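The alternating steps (1) and (2) can be sketched as follows. This is a simplified illustration, not the package's actual implementation: it assumes a plain linear model (no family), uses hypothetical variable names (response y, local regressor x1, global regressor x3, partitioning variables z1 and z2), and requires the partykit package.

```r
## minimal sketch of the PALM tree iteration (illustration only);
## palmtree itself additionally handles GLM families, weights, etc.
library("partykit")

palm_sketch <- function(d, maxit = 100, abstol = 0.001) {
  offset <- rep(0, nrow(d))  # global contribution, initially zero
  oldll <- -Inf
  for (i in seq_len(maxit)) {
    ## (1) tree with leaf-specific coefficients, global part as offset
    tr <- lmtree(y ~ x1 | z1 + z2, data = d, offset = offset)
    ## (2) joint linear model: leaf-specific terms plus global terms
    d$.node <- factor(predict(tr, newdata = d, type = "node"))
    palm <- lm(y ~ 0 + .node + .node:x1 + x3, data = d)
    ## offset for the next tree = global part of the fitted model
    offset <- coef(palm)["x3"] * d$x3
    ## stop when the log-likelihood improves by less than abstol
    ll <- as.numeric(logLik(palm))
    if (abs(ll - oldll) < abstol) break
    oldll <- ll
  }
  list(tree = tr, palm = palm, loglik = ll, iterations = i)
}
```

Here the node-specific intercepts and x1 slopes play the role of the local coefficients, while the single x3 coefficient is held constant across all terminal nodes.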
To specify all variables in the model a formula such as y ~ x1 + x2 | x3 | z1 + z2 + z3 is used, where y is the response, x1 and x2 are the regressors in every node of the tree, x3 has a global regression coefficient, and z1 to z3 are the partitioning variables considered for growing the tree.
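Such multi-part right-hand sides follow the conventions of the Formula package, which can be used directly to inspect how the example formula above decomposes into its three parts (the variable names are the ones from the formula; the exact internal processing in palmtree may differ):

```r
## decompose a three-part formula with the Formula package
library("Formula")
f <- Formula(y ~ x1 + x2 | x3 | z1 + z2 + z3)
formula(f, lhs = 1, rhs = 1)  # local regressors:       y ~ x1 + x2
formula(f, lhs = 0, rhs = 2)  # global regressors:      ~x3
formula(f, lhs = 0, rhs = 3)  # partitioning variables: ~z1 + z2 + z3
```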
The code is still under development and might change in future versions.
The function returns a list with the following objects:
formula 
The formula as specified with the formula argument. 
call 
The matched call. 
tree 
The final lmtree/glmtree. 
palm 
The final lm/glm model for the global regression effects. 
data 
The dataset specified with the data argument. 
nobs 
Number of observations. 
loglik 
The log-likelihood value of the last iteration. 
df 
Degrees of freedom. 
dfsplit 
Degrees of freedom per selected split as specified with the dfsplit argument. 
iterations 
The number of iterations used to estimate the model tree. 
maxit 
The maximum number of iterations specified with the maxit argument. 
lmstart 
Offset in estimation of the first tree as specified in the lmstart argument. 
abstol 
The pre-specified value for the change in log-likelihood used to evaluate convergence, as specified with the abstol argument. 
intercept 
Logical specifying if an intercept was computed. 
family 
The family specified with the family argument. 
mob.control 
A list containing control parameters passed to lmtree or glmtree. 
Sies A, Van Mechelen I (2015). Comparing Four Methods for Estimating Tree-Based Treatment Regimes. Unpublished Manuscript.
## one DGP from Sies and Van Mechelen (2015)
dgp <- function(nobs = 1000, nreg = 5, creg = 0.4, ptreat = 0.5, sd = 1,
  coef = c(1, 0.25, 0.25, 0, 0, -0.25), eff = 1)
{
  d <- mvtnorm::rmvnorm(nobs,
    mean = rep(0, nreg),
    sigma = diag(1 - creg, nreg) + creg)
  colnames(d) <- paste0("x", 1:nreg)
  d <- as.data.frame(d)
  d$a <- rbinom(nobs, size = 1, prob = ptreat)
  d$err <- rnorm(nobs, mean = 0, sd = sd)
  gopt <- function(d) {
    as.numeric(d$x1 > -0.545) * as.numeric(d$x2 < 0.545)
  }
  d$y <- coef[1] + drop(as.matrix(d[, paste0("x", 1:5)]) %*% coef[-1]) -
    eff * (d$a - gopt(d))^2 + d$err
  d$a <- factor(d$a)
  return(d)
}
set.seed(1)
d < dgp()
## estimate PALM tree with correctly specified global (partially
## additive) regressors and all variables considered for partitioning
palm <- palmtree(y ~ a | x1 + x2 + x5 | x1 + x2 + x3 + x4 + x5, data = d)
print(palm)
plot(palm)
## query coefficients
coef(palm, model = "tree")
coef(palm, model = "palm")
coef(palm, model = "all")
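Continuing the example, fitted PALM trees should also work with further standard methods such as predict and logLik (a sketch, assuming the palmtree and mvtnorm packages are installed and palm and d come from the fit above):

```r
## predicted response for new observations
## (combines the tree and the global effects)
predict(palm, newdata = d[1:5, ])
## log-likelihood of the complete model
logLik(palm)
```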
