PIPs_by_landmarking: Posterior inclusion probabilities (PIPs) by landmarking


Description

This function computes the posterior inclusion probabilities (PIPs) of the candidate variables at each landmark.

Usage

PIPs_by_landmarking(fullModel, data, discreteSurv = TRUE, numberCores = 1,
  package = "nnet", maxit = 150, prior = "flat", method = "LEB",
  landmarkLength = 1, lastlandmark, timeVariableName)

Arguments

fullModel

formula of the model including all potential variables

data

the data frame with all the information

discreteSurv

Boolean variable indicating whether a 'simple' multinomial regression is looked for or whether the goal is a discrete survival-time model for multiple modes of failure.

numberCores

Number of cores to be used in parallel.

package

Package to be used to fit the models; by default the 'nnet' package is used, alternatively 'VGAM' can be specified.

maxit

Maximum number of iterations; only needs to be specified when the 'nnet' package is used.

prior

Prior on the model space

method

Method used for the definition of g, e.g. 'LEB'.

landmarkLength

Length of the landmark intervals; by default a new landmark is set each day (landmarkLength = 1). See the sketch after this argument list.

lastlandmark

Time point of the last landmark.

timeVariableName

Name of the variable indicating time.
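
The interplay of landmarkLength and lastlandmark can be illustrated as follows; this is a minimal sketch assuming the first landmark is placed at time 0 and further landmarks follow every landmarkLength time units up to lastlandmark:

# illustration only (assumption about landmark placement): landmark time
# points implied by landmarkLength = 7 and lastlandmark = 21
landmarkLength <- 7
lastlandmark <- 21
seq(from = 0, to = lastlandmark, by = landmarkLength)
# [1]  0  7 14 21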

Value

a list with the PIPs for each landmark
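
A minimal sketch of how the returned object could be handled, assuming it is a plain list with one element per landmark (the object name 'pips' below is hypothetical):

# 'pips' stands for the list returned by PIPs_by_landmarking() (hypothetical name)
# length(pips)   # number of landmarks
# pips[[1]]      # PIPs of the candidate variables at the first landmark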

Author(s)

Rachel Heyard

Examples

# load the package and the splines package (needed for ns()):
library(TBFmultinomial)
library(splines)

# extract the data:
data("VAP_data")

# the definition of the full model with three potential predictors:
FULL <- outcome ~ ns(day, df = 4) + gender + type + SOFA
# here we define time as a spline with 3 knots

PIPs_landmark <- PIPs_by_landmarking(fullModel = FULL, data = VAP_data,
                                     discreteSurv = TRUE, numberCores = 1,
                                     package = 'nnet', maxit = 150,
                                     prior = 'flat',  method = 'LEB',
                                     landmarkLength = 7, lastlandmark = 21,
                                     timeVariableName = 'day')
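
# The resulting list could then be summarized across landmarks, for example
# as follows (a sketch, assuming each list element is a named numeric vector
# holding the PIPs of the candidate predictors at one landmark):

# arrange the PIPs into a matrix with one column per landmark,
# so that the inclusion probabilities can be compared over time
PIP_matrix <- do.call(cbind, PIPs_landmark)
round(PIP_matrix, 2)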
