Description
Implements a marginal approach to generalized functional principal components analysis for sparsely observed binary curves.

Usage

gfpca_Mar(data, npc, pve, output_index, type, nbasis, gm)

Arguments

data: A data frame containing the observed data. Should have column names index, value, and id, with one row per observed point (see the sketch after this list and the Examples below).

npc: Prespecified value for the number of principal components (if given, this overrides pve).

pve: Proportion of variance explained; used to choose the number of principal components.

output_index: Grid on which estimates should be computed (a default grid is used if none is supplied).

type: Type of estimate for the FPCs; the Examples below use type = "approx".

nbasis: Number of basis functions used in the spline expansions.

gm: Argument passed to the score prediction algorithm.
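
As a quick orientation before the full simulation in the Examples section, the sketch below shows a call to gfpca_Mar on long-format binary data. The column names and type = "approx" are taken from the Examples; the toy data and the pve value are illustrative assumptions rather than package defaults.

## minimal sketch; the toy data and pve value are illustrative assumptions
set.seed(123)
ids <- rep(1:50, each = 8)                             # 50 curves, 8 sparse observations each
toy <- data.frame(index = runif(length(ids)),          # observation times in [0, 1]
                  value = rbinom(length(ids), 1, 0.5), # binary responses
                  id    = ids)                         # curve identifiers
fit_toy <- gfpca_Mar(data = toy, type = "approx", pve = 0.9)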

Author(s)

Jan Gertheiss <jan.gertheiss@agr.uni-goettingen.de>

References

Gertheiss, J., Goldsmith, J., and Staicu, A.-M. (2016). A note on modeling sparse exponential-family functional response curves. Under review.

Examples

## Not run: 
library(mvtnorm)
library(boot)
library(refund.shiny)
## set simulation design elements
bf = 10                           ## number of bspline fns used in smoothing the cov
D = 101                           ## size of grid for observations
Kp = 2                            ## number of true FPC basis functions
grid = seq(0, 1, length = D)
## sample size and sparsity
I <- 300
mobs <- 7:10
## mean structure
mu <- 8*(grid - 0.4)^2 - 3
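## (optional, illustrative) inspect the true mean on the linear-predictor scale:
## plot(grid, mu, type = "l")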
## Eigenfunctions /Eigenvalues for cov:
psi.true = matrix(NA, 2, D)
psi.true[1,] = sqrt(2)*cos(2*pi*grid)
psi.true[2,] = sqrt(2)*sin(2*pi*grid)
lambda.true = c(1, 0.5)
## generate data
set.seed(1)
## pca effects: xi_i1 phi1(t)+ xi_i2 phi2(t)
c.true = rmvnorm(I, mean = rep(0, Kp), sigma = diag(lambda.true))
Zi = c.true %*% psi.true
Wi = matrix(rep(mu, I), nrow = I, byrow = TRUE) + Zi  ## latent curves on the logit scale: mean + subject-specific PC effects
pi.true = inv.logit(Wi)  # inverse logit is defined by g(x)=exp(x)/(1+exp(x))
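## note: inv.logit() from 'boot' is the same transformation as base R's plogis(),
## so an equivalent alternative would be (illustrative):
## pi.true = plogis(Wi)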
Yi.obs = matrix(NA, I, D)
for(i in 1:I){
  for(j in 1:D){
    Yi.obs[i,j] = rbinom(1, 1, pi.true[i,j])
  }
}
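## an equivalent vectorized draw (sketch; not part of the original example):
## Yi.obs = matrix(rbinom(I * D, 1, pi.true), I, D)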
## "sparsify" data
for (i in 1:I)
{
  mobsi <- sample(mobs, 1)
  obsi <- sample(1:D, mobsi)
  Yi.obs[i,-obsi] <- NA
}
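## each curve now retains only mobsi (between 7 and 10) randomly chosen grid points;
## all other grid points are set to NA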
Y.vec = as.vector(t(Yi.obs))
subject <- rep(1:I, rep(D,I))
t.vec = rep(grid, I)
data.sparse = data.frame(
  index = t.vec,
  value = Y.vec,
  id = subject
)
data.sparse = data.sparse[!is.na(data.sparse$value),]
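## data.sparse is in long format (columns index, value, id) with one row per observed
## point; this is the input format passed to gfpca_Mar below
## head(data.sparse)   # optional check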
## fit models
## marginal according to Hall et al. (2008)
fit.mar = gfpca_Mar(data = data.sparse, type="approx")
plot(mu)                     ## true mean curve used in the simulation
lines(fit.mar$mu, col = 2)   ## estimated mean from the marginal fit
plot_shiny(fit.mar)          ## interactive visualization via refund.shiny
## End(Not run)