iHaz: Estimation of monotone hazard and survival functions


Description

The main function for estimating the hazard function (and consequently the survival function) for left-truncated, interval-censored data under the assumption of a monotone hazard function. The inputs are the baseline times (left truncation times) and the two endpoints of the censoring interval. The censoring interval can also be (0, b) or (a, Inf) to denote left and right censoring, respectively. Similarly, setting the truncation time to zero specializes to the case of censored data without left truncation.
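For instance, using the list components a, b and t described under Arguments, a small illustrative dataset mixing interval, left and right censored observations (the numbers below are made up purely for illustration) could be encoded as:

# Hypothetical toy data illustrating the encoding described above
# subject 1: interval censored in (0.4, 1.2), left truncated at 0.1
# subject 2: left censored (event before 0.9), no truncation (t = 0)
# subject 3: right censored after 1.5, left truncated at 0.3
dat <- list(t = c(0.1, 0,   0.3),
            a = c(0.4, 0,   1.5),
            b = c(1.2, 0.9, Inf))
# fit <- iHaz(dat)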

Usage

iHaz(data, ini.index = 1:3, inter.maxiter = 1000, inter.tol = 1e-04,
    main.maxiter = 1000, main.tol = 0.001,
    check.condition = c("derv", "KKT"), verbose = FALSE)

Arguments

data

A data frame or list with the following components: t, the baseline/left truncation times; a and b, the endpoints for interval censoring, with a < x < b where x is the true (unobserved) event time.

ini.index

The initial index set for the support reduction algorithm. See details below.

inter.maxiter

The maximum number of iterations for the secondary/intermediate algorithm, which optimizes over the reduced support set. See details below.

inter.tol

The tolerance to determine the stopping condition for the intermediate algorithm. See details below.

main.maxiter

The maximum number of iterations for the main loop of the algorithm. See details below.

main.tol

The tolerance to determine the stopping condition for the main loop. See details below.

check.condition

A string specifying the convergence criterion. The possible options are "KKT", which uses the Karush-Kuhn-Tucker conditions, or "derv", which uses a derivative condition as in Lemma 3.1 of Wellner and Zhan (1997).

verbose

Logical indicator to specify whether or not details should be printed to the screen while running the algorithm.

Details

The estimated hazard function obtained via the projection algorithm of Pan and Chappell (1998) is a step function with a potential change point at every time point in the dataset. In practice, however, the estimated hazard has only a small number of change points. This naturally leads to a support reduction type algorithm [Groeneboom et al. (2008)]. We define z = sort(unique(c(a, b, t))) as the set of all possible time points. The estimated hazard function is a step function with potential change points at each element of z. For the algorithm of Pan and Chappell (1998) this gives a parameter vector lambda of length K = length(z). In practice, however, the total number of change points of the estimated step function is much smaller than K.
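The following sketch illustrates this parameterization only; the vectors and the hazard values are made up, infinite endpoints are dropped for convenience, and z is constructed internally by the package:

# Illustrative sketch of the step-function parameterization (not package code)
t <- c(0.1, 0.0, 0.3)
a <- c(0.4, 0.0, 1.5)
b <- c(1.2, 0.9, Inf)
z <- sort(unique(c(a, b, t)))          # all candidate change points
z <- z[is.finite(z)]                   # drop infinite endpoints (assumption for this sketch)
K <- length(z)                         # length of the full parameter vector lambda
lambda <- seq(0.2, 1, length.out = K)  # an illustrative non-decreasing hazard
haz <- stepfun(z[-1], lambda)          # hazard constant between consecutive change points
haz(0.75)                              # hazard evaluated at an arbitrary time point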

Our algorithm proceeds as follows:

We begin with ini.index, the initial indices, which are a subset of the vector 1:K. For computational reasons the initial index set must contain 1. We then optimize the objective function over the class of step functions with change points given by ini.index. This is done via the projection algorithm of Pan and Chappell (1998). inter.maxiter and inter.tol specify the maximum number of iterations and the tolerance for checking the stopping condition of this projection algorithm, respectively.

Once the function is optimized over the reduced support, we check whether the optimality condition (specified by check.condition) is met up to the main.tol accuracy. If the condition is not met, support points are added and the index set is updated. This continues until convergence or until main.maxiter iterations have been performed.
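For example, a call that tightens the main tolerance and uses the KKT stopping rule, using only the arguments documented above and a data list dat in the format described under Arguments, might look like:

fit <- iHaz(dat, ini.index = 1:3,
            main.maxiter = 2000, main.tol = 1e-4,
            check.condition = "KKT", verbose = TRUE)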

Value

An object of class "iHaz" whcih is a list with the following components:

hazard

The estimated hazard function. This is an R function of class "stepfun".

survival

The estimated survival function. This function takes as input a vector of time points at which to evaluate the estimated survival probabilities.

index

The index set of our estimates. This corresponds to the indices of z; it is the final updated index set of our algorithm.

a, b, t

The values of a, b and t used in estimation of survival and hazard functions.

conv

A logical indicator of convergence status.

Author(s)

Asad Haris, Gary Chan

References

Pan, W. and Chappell, R. (1998). "Estimating survival curves with left-truncated and interval-censored data under monotone hazards." Biometrics, 1053-1060.

Groeneboom, P., Jongbloed, G. and Wellner, J. A. (2008). "The support reduction algorithm for computing non-parametric function estimates in mixture models." Scandinavian Journal of Statistics, 35(3), 385-399.

Wellner, J. A. and Zhan, Y. (1997). "A hybrid algorithm for computation of the nonparametric maximum likelihood estimator from censored data." Journal of the American Statistical Association, 92(439), 945-959.

See Also

plot.iHaz

Examples

library(iHaz)
# Generate some data
# Here the event time is distributed Exponential(1)
# Simulation study from Pan and Chappell (1998)
n <- 500
x <- rexp(n)
t <- runif(n, min = 0, max = 1.5)
xnew <- x[x >= t]
tnew <- t[x >= t]
t <- tnew
a <- xnew
b <- xnew
a[xnew <= tnew + 0.5] <- tnew[xnew <= tnew + 0.5]
b[xnew <= tnew + 0.5] <- tnew[xnew <= tnew + 0.5] + 0.5
a[xnew > tnew + 0.5] <- tnew[xnew > tnew + 0.5] + 0.5
b[xnew > tnew + 0.5] <- Inf
dat <- list("a" = a, "b" = b, "t" = t)

# Fit an 'iHaz' object
fit <- iHaz(dat, ini.index = 1:3, verbose = TRUE)

# View/plot the estimated hazard function
fit$hazard
plot(fit$hazard, main = "Hazard Function")

# View the survival function
fit$survival
# Estimated survival probabilities at some time points
fit$survival(c(0.5, 0.8, 1, 1.5))

# Plot the iHaz object
plot(fit, col = "red", type = "o", lwd = 1, pch = 16, cex = 0.5)
