Description

Fit sparse variational Bayesian proportional hazards models.

Usage

svb.fit(Y, delta, X, lambda, a0, b0, mu.init, s.init, g.init,
        maxiter, tol, alpha, center, verbose)

Arguments
Y
Failure times.

delta
Censoring indicator, 0: censored, 1: uncensored.

X
Design matrix.

lambda
Penalisation parameter, default:

a0
Beta distribution parameter, default:

b0
Beta distribution parameter, default:

mu.init
Initial value for the means of the Gaussian component of the variational family (μ), default taken from a LASSO fit.

s.init
Initial value for the standard deviations of the Gaussian component of the variational family (s), default:

g.init
Initial value for the inclusion probabilities (γ), default:

maxiter
Maximum number of iterations, default:

tol
Convergence tolerance, default:

alpha
The elastic-net mixing parameter used for the LASSO fit that initialises mu.init.

center
Center X prior to fitting; increases numerical stability, default:

verbose
Print additional information, default:
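As a rough sketch of how these arguments fit together (the values below are illustrative choices, not the package defaults), a call setting several of them explicitly might look like:

fit <- survival.svb::svb.fit(Y, delta, X,
        lambda=1,                  # penalisation parameter (illustrative value)
        mu.init=rep(0, ncol(X)),   # start the Gaussian means at zero
        maxiter=1000,              # illustrative iteration cap
        tol=1e-3,                  # illustrative convergence tolerance
        verbose=FALSE)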
Value

Returns a list containing:

beta_hat
Point estimate for the coefficients β, taken as the mean under the variational approximation, \hat{β}_j = E_{\tilde{Π}}[β_j] = γ_j μ_j.

inclusion_prob
Posterior inclusion probabilities, i.e. the approximate posterior probability that each coefficient is non-zero.

m
Final values for the means of the Gaussian component of the variational family (μ).

s
Final values for the standard deviations of the Gaussian component of the variational family (s).

g
Final values for the inclusion probabilities (γ).

lambda
Value of lambda used.

a0
Value of α_0 used.

b0
Value of β_0 used.

converged
Describes whether the algorithm converged.
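Given a fitted object f, the point estimate can be recovered from the returned variational parameters, and a sparse model can be read off by thresholding the inclusion probabilities. A short sketch using only the list elements documented above:

# beta_hat is the posterior mean under the variational approximation:
# beta_hat_j = g_j * m_j
all.equal(f$beta_hat, f$g * f$m)

# keep coefficients with posterior inclusion probability above 0.5
# (the 0.5 cut-off is an illustrative choice, not a package default)
selected <- which(f$inclusion_prob > 0.5)
f$beta_hat[selected]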
Details

Rather than compute the posterior using MCMC, we approximate it using variational inference. Variational inference re-casts Bayesian inference as an optimisation problem, in which we minimise the Kullback-Leibler (KL) divergence between a family of tractable distributions and the posterior, Π.

In our case we use a mean-field variational family,

Q = \{ ∏_{j=1}^p [ γ_j N(μ_j, s_j^2) + (1 - γ_j) δ_0 ] \}

where μ_j is the mean and s_j the standard deviation of the Gaussian component, γ_j the inclusion probability, δ_0 a Dirac mass at zero, and p the number of coefficients.

The parameters of the variational family (μ, s, γ) are then optimised by minimising the KL divergence between the variational family and the posterior,

\tilde{Π} = \arg \min_{Q' ∈ Q} KL(Q' \| Π).

We use co-ordinate ascent variational inference (CAVI) to solve this optimisation problem.
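To make the form of the variational family concrete, the following sketch (a hypothetical helper, not a function exported by survival.svb) draws samples of a single coefficient from the spike-and-slab distribution γ_j N(μ_j, s_j^2) + (1 - γ_j) δ_0:

# draw n samples of one coefficient from the spike-and-slab
# variational distribution gamma_j * N(mu_j, s_j^2) + (1 - gamma_j) * delta_0
r_spike_slab <- function(n, gamma_j, mu_j, s_j) {
  slab <- rbinom(n, size=1, prob=gamma_j)   # 1: slab, 0: spike at zero
  slab * rnorm(n, mean=mu_j, sd=s_j)
}

samples <- r_spike_slab(1e4, gamma_j=0.8, mu_j=1.5, s_j=0.1)
mean(samples)        # approximately gamma_j * mu_j, i.e. the beta_hat estimate
mean(samples == 0)   # approximately 1 - gamma_j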
Examples

n <- 125                # number of samples
p <- 250                # number of features
s <- 5                  # number of non-zero coefficients
censoring_lvl <- 0.25   # degree of censoring

# generate some test data
set.seed(1)
b <- sample(c(runif(s, -2, 2), rep(0, p-s)))
X <- matrix(rnorm(n * p), nrow=n)
Y <- log(1 - runif(n)) / -exp(X %*% b)
delta <- runif(n) > censoring_lvl            # 0: censored, 1: uncensored
Y[!delta] <- Y[!delta] * runif(sum(!delta))  # rescale censored data

# fit the model
f <- survival.svb::svb.fit(Y, delta, X, mu.init=rep(0, p))
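
# (sketch, not part of the original example) inspect the fit using the
# returned components documented above
f$converged                          # did CAVI converge?
head(which(f$inclusion_prob > 0.5))  # indices of selected coefficients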

## Larger Example
n <- 250                # number of samples
p <- 1000               # number of features
s <- 10                 # number of non-zero coefficients
censoring_lvl <- 0.4    # degree of censoring

# generate some test data
set.seed(1)
b <- sample(c(runif(s, -2, 2), rep(0, p-s)))
X <- matrix(rnorm(n * p), nrow=n)
Y <- log(1 - runif(n)) / -exp(X %*% b)
delta <- runif(n) > censoring_lvl            # 0: censored, 1: uncensored
Y[!delta] <- Y[!delta] * runif(sum(!delta))  # rescale censored data

# fit the model
f <- survival.svb::svb.fit(Y, delta, X)
# plot the results
plot(b, xlab=expression(beta), main="Coefficient value", pch=8, ylim=c(-2,2))
points(f$beta_hat, pch=20, col=2)
legend("topleft", legend=c(expression(beta), expression(hat(beta))),
pch=c(8, 20), col=c(1, 2))
plot(f$inclusion_prob, main="Inclusion Probabilities", ylab=expression(gamma))
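
# (sketch, not part of the original example) compare the selected support
# to the true non-zero coefficients
truth <- which(b != 0)
selected <- which(f$inclusion_prob > 0.5)   # 0.5 cut-off is illustrative
length(intersect(selected, truth))          # true positives
length(setdiff(selected, truth))            # false positives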