Description

This is the main function of the functional forest (FunFor) algorithm.
Arguments

formula      Formula of the model to be fitted
data         Data frame containing the response curves and the scalar predictors
mtry         Number of variables tried at each split
ntree        Number of trees to grow
importance   Whether to calculate variable importance measures (PVIM)
npc          Number of principal components (PCs) to use when smoothing
m_split      Optimal tree size
smooth       Whether to smooth the response curves
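To see how these arguments map onto a call, the sketch below sets each of them explicitly. It assumes the formula and data objects constructed in the Examples section, treats importance and smooth as logical flags, and uses illustrative values rather than package defaults.

## Illustrative sketch only: values are not defaults, and importance/smooth
## are assumed to be logical flags
fit = FunFor(formula, data,
             mtry = 40,                              # variables tried per split
             ntree = 10,                             # trees in the forest
             importance = TRUE,                      # request variable importance (PVIM)
             npc = 3,                                # PCs kept when smoothing
             m_split = optimal_size(formula, data),  # data-driven tree size
             smooth = TRUE)                          # smooth the response curves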
Details

The FunFor algorithm predicts curve responses for new observations and selects important variables from a large set of scalar predictors. See ?optimal_size for how to determine the optimal size of a tree fit, and see ?predict.mvRF for how to predict new observations from a fitted FunFor model.
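As a hedged illustration of the prediction step, the sketch below assumes a fitted object funfor_fit and a data frame data like the ones built in the Examples section, and assumes that predict.mvRF takes the fitted model followed by the new observations; ?predict.mvRF remains the authoritative reference for its interface.

## Hedged sketch: the call form predict.mvRF(fit, newdata) is an assumption
new_obs = data[1:5, ]                            # treat a few rows as new observations
pred_curves = predict.mvRF(funfor_fit, new_obs)  # predicted response curves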
Value

imp     Importance measures for the candidate predictors
fits    A list of the fitted trees, used to make predictions
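Assuming the fitted object behaves like a list with these two components (this structure is an assumption based on the descriptions above), they could be inspected as follows for a model funfor_fit from the Examples section:

## Hedged sketch: assumes a list-like fit with components imp and fits
head(sort(funfor_fit$imp, decreasing = TRUE))  # predictors with the largest importance
length(funfor_fit$fits)                        # one fitted tree per ntree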
Examples

library(MASS)

## Simulation settings
nbx = 100                   # number of scalar predictors (also rows drawn for X)
nbobs = 100                 # number of observations (response curves)
T = seq(0, 1, len = 100)    # grid on which the response curves are observed
m = 1                       # diagonal value of the predictor correlation matrix
rho = 0.6                   # AR(1) correlation between neighbouring predictors
mu = matrix(1, nbx, 1)      # mean vector of the predictors

## AR(1)-type correlation matrix with diagonal m and correlation rho
ar1_cor = function(n, m, rho) {
  exponent = abs(matrix(1:n - 1, nrow = n, ncol = n, byrow = TRUE) - (1:n - 1))
  L = rho^exponent
  diag(L) = m
  L
}

p = nbx
x_sigma = ar1_cor(p, m, rho)
noise_sigma = ar1_cor(length(T), (5 * cos(T) + rnorm(length(T), 0, 1)) / 10, 0.01)
beta_1 = function(t) sin(20 * pi * t) / 3   # coefficient function giving the curve shape

## Generate the scalar predictors and the curve responses
## (the true signal depends only on predictors 2 and 3)
X = mvrnorm(nbx, mu, x_sigma)
X = as.data.frame(X)
Y = data.frame(matrix(NA, nrow = nbobs, ncol = length(T)))
for (j in 1:nbobs) {
  Y[j, ] = (X[j, 2] * X[j, 3]) * beta_1(T) + mvrnorm(1, rep(0, length(T)), noise_sigma)
}

## Fit the functional forest: the response is the block of curve columns in `data`
formula = paste("data[, 1:length(T)]", "~", paste(names(X), collapse = "+"))
data = cbind(Y, X)
o_split = optimal_size(formula, data)
funfor_fit = FunFor(formula, data, mtry = 40, ntree = 10, npc = 3, m_split = o_split)
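As an optional follow-up that uses only the simulated objects above (no assumptions about the FunFor return value), the response curves can be plotted against the shape of the true coefficient function to show what the forest has to recover:

## Visual check of the simulated data: a few curves vs. the signal shape
matplot(T, t(Y[1:5, ]), type = "l", lty = 1, col = "grey70",
        xlab = "t", ylab = "Y(t)")
lines(T, beta_1(T), lwd = 2)  # sin(20 * pi * t) / 3, the underlying shape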