ipwtm | R Documentation
Estimate inverse probability weights to fit marginal structural models, with a time-varying exposure and time-varying confounders. Within each unit under observation this function computes inverse probability weights at each time point during follow-up. The exposure can be binomial, multinomial, ordinal or continuous. Both stabilized and unstabilized weights can be estimated.
ipwtm(exposure, family, link, numerator = NULL, denominator, id,
tstart, timevar, type, data, corstr = "ar1", trunc = NULL,
...)
exposure | vector, representing the exposure of interest. Both numerical and categorical variables can be used. A binomial exposure variable should be coded using values 0/1.
family | specifies a family of link functions, used to model the relationship between the variables in numerator or denominator and exposure. Alternatives are "binomial", "survival", "multinomial", "ordinal" and "gaussian".
link | specifies the specific link function between the variables in numerator or denominator and exposure; used only for families that allow a choice of link function, such as "binomial".
numerator | is a formula, specifying the right-hand side of the model used to estimate the elements in the numerator of the inverse probability weights. When left unspecified, unstabilized weights with a numerator of 1 are estimated.
denominator | is a formula, specifying the right-hand side of the model used to estimate the elements in the denominator of the inverse probability weights.
id | vector, uniquely identifying the units under observation (typically patients) within which the longitudinal measurements are taken.
tstart | numerical vector, representing the starting time of follow-up intervals, using the counting process notation. This argument is only needed when family = "survival".
timevar | numerical vector, representing follow-up time, starting at 0.
type | specifies the type of exposure. Alternatives include "first" (an exposure that can change value at most once, modelled up to and including the first switch) and "cens" (a censoring indicator, for inverse probability of censoring weights), as illustrated in the examples below.
data | dataframe containing exposure, the variables in numerator and denominator, id, tstart and timevar.
corstr | correlation structure, only needed when using family = "gaussian"; defaults to "ar1".
trunc | optional truncation percentile (0-0.5). E.g. when trunc = 0.01, the weights are truncated at the 1st and 99th percentiles; the truncated weights are then returned in weights.trunc in addition to the untruncated weights.
... | are further arguments passed to the function that is used to estimate the numerator and denominator models (the function is chosen using family).
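The behaviour described for trunc can be sketched in base R as two-sided clipping at the chosen percentiles (a hypothetical illustration of the idea, not the package's internal code; truncate_weights is a made-up helper name):

```r
# Sketch of percentile truncation as described for the trunc argument:
# clip the weights at the trunc and (1 - trunc) quantiles.
# (Illustration only; not the package source.)
truncate_weights <- function(w, trunc = 0.01) {
  bounds <- quantile(w, probs = c(trunc, 1 - trunc))
  pmin(pmax(w, bounds[1]), bounds[2])
}

set.seed(1)
w <- exp(rnorm(1000))            # skewed, strictly positive "weights"
w.trunc <- truncate_weights(w, trunc = 0.01)
range(w)                         # extreme values present
range(w.trunc)                   # tails clipped to the 1st/99th percentiles
```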
Within each unit under observation i (usually patients), this function computes inverse probability weights at each time point j during follow-up. These weights are the cumulative product over all previous time points up to j of the ratio of two probabilities:
The numerator contains, at each time point, the probability of the observed exposure level given the observed values of the stabilization factors and the observed exposure history up to the time point before j. These probabilities are estimated using the model regressing exposure on the terms in numerator, using the link function indicated by family and link.
The denominator contains, at each time point, the probability of the observed exposure level given the observed history of time-varying confounders up to j, as well as the stabilization factors in the numerator and the observed exposure history up to the time point before j. These probabilities are estimated using the model regressing exposure on the terms in denominator, using the link function indicated by family and link.
When the models from which the elements in the numerator and denominator are predicted are correctly specified, and there is no unmeasured confounding, weighting observations ij by the inverse probability weights adjusts for confounding of the effect of the exposure of interest. On the weighted dataset a marginal structural model can then be fitted, quantifying the causal effect of the exposure on the outcome of interest.
With numerator specified, stabilized weights are computed; otherwise, unstabilized weights with a numerator of 1 are computed. With a continuous exposure, using family = "gaussian", weights are computed using the ratio of predicted densities at each time point. Therefore, for family = "gaussian" only stabilized weights can be used, since unstabilized weights would have infinite variance.
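The cumulative-product construction described above can be sketched in base R for a binary time-varying exposure, using glm() for the numerator and denominator models and cumprod() within each unit (a minimal illustration on simulated data; ipwtm itself also handles exposure lagging, selection and the other families):

```r
# Minimal sketch of stabilized inverse probability weights for a
# binary time-varying exposure (simulated data; illustration only).
set.seed(42)
n <- 200; tmax <- 5
d <- data.frame(id = rep(1:n, each = tmax), time = rep(0:(tmax - 1), n))
d$conf <- rnorm(nrow(d))                               # time-varying confounder
d$exposure <- rbinom(nrow(d), 1, plogis(-1 + d$conf))  # exposure depends on confounder

num.mod <- glm(exposure ~ 1,    family = binomial, data = d)  # numerator model
den.mod <- glm(exposure ~ conf, family = binomial, data = d)  # denominator model

# Probability of the exposure level actually observed at each time point
p.num <- ifelse(d$exposure == 1, fitted(num.mod), 1 - fitted(num.mod))
p.den <- ifelse(d$exposure == 1, fitted(den.mod), 1 - fitted(den.mod))

# Stabilized weight: cumulative product of the ratio within each unit
d$sw <- unlist(tapply(p.num / p.den, d$id, cumprod))
summary(d$sw)   # with correctly specified models the mean is typically near 1
```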
A list containing the following elements:
ipw.weights | vector containing inverse probability weights for each observation, returned in the same order as the observations in data.
weights.trunc | vector containing truncated inverse probability weights; only returned when trunc is specified.
call | the original function call.
selvar | selection variable. With type = "first", selvar = 1 within each unit under observation up to and including the first time point at which the exposure changes value, and selvar = 0 after that.
num.mod | the numerator model; only returned when numerator is specified.
den.mod | the denominator model.
Currently, the exposure variable, the variables used in numerator and denominator, id, tstart and timevar should not contain missing values.
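Since missing values are not handled, it can help to verify completeness before calling ipwtm; a simple base-R check (the data frame and column names here are hypothetical placeholders):

```r
# Check that the columns passed to ipwtm are free of missing values
# (hypothetical example data).
d <- data.frame(id       = c(1, 1, 2, 2),
                tstart   = c(-1, 0, -1, 0),
                fuptime  = c(0, 1, 0, 1),
                exposure = c(0, 1, 0, 0),
                conf     = c(2.1, NA, 1.3, 1.5))
used <- c("id", "tstart", "fuptime", "exposure", "conf")
colSums(is.na(d[used]))                      # NA count per column
d.complete <- d[complete.cases(d[used]), ]   # drop incomplete rows (or impute instead)
```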
Willem M. van der Wal willem@vanderwalresearch.com, Ronald B. Geskus rgeskus@oucru.org
Cole, S.R. & Hernán, M.A. (2008). Constructing inverse probability weights for marginal structural models. American Journal of Epidemiology, 168(6), 656-664. https://pubmed.ncbi.nlm.nih.gov/18682488/.
Robins, J.M., Hernán, M.A. & Brumback, B.A. (2000). Marginal structural models and causal inference in epidemiology. Epidemiology, 11, 550-560. https://pubmed.ncbi.nlm.nih.gov/10955408/.
Van der Wal W.M. & Geskus R.B. (2011). ipw: An R Package for Inverse Probability Weighting. Journal of Statistical Software, 43(13), 1-23. doi:10.18637/jss.v043.i13.
basdat, haartdat, ipwplot, ipwpoint, timedat, tstartfun.
########################################################################
#EXAMPLE 1
#Load longitudinal data from HIV positive individuals.
data(haartdat)
#CD4 is confounder for the effect of initiation of HAART therapy on mortality.
#Estimate inverse probability weights to correct for confounding.
#Exposure allocation model is Cox proportional hazards model.
temp <- ipwtm(
exposure = haartind,
family = "survival",
numerator = ~ sex + age,
denominator = ~ sex + age + cd4.sqrt,
id = patient,
tstart = tstart,
timevar = fuptime,
type = "first",
data = haartdat)
#plot inverse probability weights
graphics.off()
ipwplot(weights = temp$ipw.weights, timevar = haartdat$fuptime,
binwidth = 100, ylim = c(-1.5, 1.5), main = "Stabilized inverse probability weights")
#CD4 count has an effect both on dropout and mortality, which causes informative censoring.
#Use inverse probability of censoring weighting to correct for effect of CD4 on dropout.
#Use Cox proportional hazards model for dropout.
temp2 <- ipwtm(
exposure = dropout,
family = "survival",
numerator = ~ sex + age,
denominator = ~ sex + age + cd4.sqrt,
id = patient,
tstart = tstart,
timevar = fuptime,
type = "cens",
data = haartdat)
#plot inverse probability of censoring weights
graphics.off()
ipwplot(weights = temp2$ipw.weights, timevar = haartdat$fuptime,
binwidth = 100, ylim = c(-1.5, 1.5), main = "Stabilized inverse probability of censoring weights")
#MSM for the causal effect of initiation of HAART on mortality.
#Corrected both for confounding and informative censoring.
#With robust standard error obtained using cluster().
require(survival)
summary(coxph(Surv(tstart, fuptime, event) ~ haartind + cluster(patient),
data = haartdat, weights = temp$ipw.weights*temp2$ipw.weights))
#uncorrected model
summary(coxph(Surv(tstart, fuptime, event) ~ haartind, data = haartdat))
########################################################################
#EXAMPLE 2
data(basdat)
data(timedat)
#Aim: to model the causal effect of active tuberculosis (TB) on mortality.
#Longitudinal CD4 is a confounder as well as intermediate for the effect of TB.
#process original measurements
#check for ties (not allowed)
table(duplicated(timedat[,c("id", "fuptime")]))
#take square root of CD4 because of skewness
timedat$cd4.sqrt <- sqrt(timedat$cd4count)
#add TB time to dataframe
timedat <- merge(timedat, basdat[,c("id", "Ttb")], by = "id", all.x = TRUE)
#compute TB status
timedat$tb.lag <- ifelse(with(timedat, !is.na(Ttb) & fuptime > Ttb), 1, 0)
#longitudinal CD4-model
require(nlme)
cd4.lme <- lme(cd4.sqrt ~ fuptime + tb.lag, random = ~ fuptime | id,
data = timedat)
#build new dataset:
#rows corresponding to TB-status switches, and individual end times
times <- sort(unique(c(basdat$Ttb, basdat$Tend)))
startstop <- data.frame(
id = rep(basdat$id, each = length(times)),
fuptime = rep(times, nrow(basdat)))
#add baseline data to dataframe
startstop <- merge(startstop, basdat, by = "id", all.x = TRUE)
#limit individual follow-up using Tend
startstop <- startstop[with(startstop, fuptime <= Tend),]
startstop$tstart <- tstartfun(id, fuptime, startstop) #compute tstart (?tstartfun)
#indicate TB status
startstop$tb <- ifelse(with(startstop, !is.na(Ttb) & fuptime >= Ttb), 1, 0)
#indicate TB status at previous time point
startstop$tb.lag <- ifelse(with(startstop, !is.na(Ttb) & fuptime > Ttb), 1, 0)
#indicate death
startstop$event <- ifelse(with(startstop, !is.na(Tdeath) & fuptime >= Tdeath),
1, 0)
#impute CD4, based on TB status at previous time point.
startstop$cd4.sqrt <- predict(cd4.lme, newdata = data.frame(id = startstop$id,
fuptime = startstop$fuptime, tb.lag = startstop$tb.lag))
#compute inverse probability weights
temp <- ipwtm(
exposure = tb,
family = "survival",
numerator = ~ 1,
denominator = ~ cd4.sqrt,
id = id,
tstart = tstart,
timevar = fuptime,
type = "first",
data = startstop)
summary(temp$ipw.weights)
ipwplot(weights = temp$ipw.weights, timevar = startstop$fuptime, binwidth = 100)
#models
#IPW-fitted MSM, using cluster() to obtain robust standard error estimate
require(survival)
summary(coxph(Surv(tstart, fuptime, event) ~ tb + cluster(id),
data = startstop, weights = temp$ipw.weights))
#unadjusted
summary(coxph(Surv(tstart, fuptime, event) ~ tb, data = startstop))
#adjusted using conditioning: part of the effect of TB is adjusted away
summary(coxph(Surv(tstart, fuptime, event) ~ tb + cd4.sqrt, data = startstop))
## Not run:
#compute bootstrap CI for TB parameter (takes a few hours)
#taking into account the uncertainty introduced by modelling longitudinal CD4
#taking into account the uncertainty introduced by estimating the inverse probability weights
#robust with regard to weights unequal to 1
# require(boot)
# boot.fun <- function(data, index, data.tm){
# data.samp <- data[index,]
# data.samp$id.samp <- 1:nrow(data.samp)
# data.tm.samp <- do.call("rbind", lapply(data.samp$id.samp, function(id.samp) {
# cbind(data.tm[data.tm$id == data.samp$id[data.samp$id.samp == id.samp],],
# id.samp = id.samp)
# }
# ))
# cd4.lme <- lme(cd4.sqrt ~ fuptime + tb.lag, random = ~ fuptime | id.samp, data = data.tm.samp)
# times <- sort(unique(c(data.samp$Ttb, data.samp$Tend)))
# startstop.samp <- data.frame(id.samp = rep(data.samp$id.samp, each = length(times)),
# fuptime = rep(times, nrow(data.samp)))
# startstop.samp <- merge(startstop.samp, data.samp, by = "id.samp", all.x = TRUE)
# startstop.samp <- startstop.samp[with(startstop.samp, fuptime <= Tend),]
# startstop.samp$tstart <- tstartfun(id.samp, fuptime, startstop.samp)
# startstop.samp$tb <- ifelse(with(startstop.samp, !is.na(Ttb) & fuptime >= Ttb), 1, 0)
# startstop.samp$tb.lag <- ifelse(with(startstop.samp, !is.na(Ttb) & fuptime > Ttb), 1, 0)
# startstop.samp$event <- ifelse(with(startstop.samp, !is.na(Tdeath) & fuptime >= Tdeath), 1, 0)
# startstop.samp$cd4.sqrt <- predict(cd4.lme, newdata = data.frame(id.samp =
# startstop.samp$id.samp, fuptime = startstop.samp$fuptime, tb.lag = startstop.samp$tb.lag))
#
# return(coef(coxph(Surv(tstart, fuptime, event) ~ tb, data = startstop.samp,
# weights = ipwtm(
# exposure = tb,
# family = "survival",
# numerator = ~ 1,
# denominator = ~ cd4.sqrt,
# id = id.samp,
# tstart = tstart,
# timevar = fuptime,
# type = "first",
# data = startstop.samp)$ipw.weights))[1])
# }
# bootres <- boot(data = basdat, statistic = boot.fun, R = 999, data.tm = timedat)
# bootres
# boot.ci(bootres, type = "basic")
#
## End(Not run)