Description
This function can be used to easily generate input matrices for lvnet based on a lavaan model.
Usage

lav2lvnet(model, data, std.lv = TRUE, lavaanifyOps = list(auto = TRUE, std.lv = std.lv))
Arguments

model         Lavaan model syntax.
data          The dataset; only used to extract the order of the variable names from the column names.
std.lv        Should the model be identified by constraining the latent variable variances to 1? Defaults to TRUE.
lavaanifyOps  A list with other options sent to lavaanify.
Value

A list with the model matrices for lambda, psi, theta and beta.
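As a quick sketch of inspecting the returned list (the element names follow the description above; the stated dimensions assume the three-factor Holzinger-Swineford model from the Examples below):

mod <- lav2lvnet(HS.model, HolzingerSwineford1939[7:15])
names(mod)       # should include "lambda", "psi", "theta" and "beta"
dim(mod$lambda)  # expected 9 x 3: observed indicators by latent variables
dim(mod$beta)    # expected 3 x 3: directed paths between latent variables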
Author(s)

Sacha Epskamp <mail@sachaepskamp.com>
Examples

## Not run:
library("lavaan")
# Load dataset:
data(HolzingerSwineford1939)
Data <- HolzingerSwineford1939[,7:15]
# lavaan model
HS.model <- '
visual =~ x1 + x2 + x3
textual =~ x4 + x5 + x6
speed =~ x7 + x8 + x9 '
# fit via lavaan:
lavFit <- cfa(HS.model, HolzingerSwineford1939[7:15], std.lv = TRUE)
# Fit via lvnet:
mod <- lav2lvnet(HS.model, HolzingerSwineford1939[7:15])
lvnetFit <- lvnet(Data, lambda = mod$lambda, psi = mod$psi)
# Compare:
Compare <- data.frame(
lvnet = round(unlist(lvnetFit$fitMeasures)[c("npar","df","chisq","fmin","aic","bic",
"rmsea","cfi","tli","nfi","logl")],3),
lavaan = round(fitMeasures(lavFit)[c("npar","df","chisq","fmin","aic","bic","rmsea",
"cfi","tli","nfi","logl")],3))
Compare
## End(Not run)
Example output:

          lvnet    lavaan
npar     21.000    21.000
df       24.000    24.000
chisq    85.306    85.306
fmin      0.142     0.142
aic    7517.490  7517.490
bic    7595.339  7595.339
rmsea     0.092     0.092
cfi       0.931     0.931
tli       0.896     0.896
nfi       0.907     0.907
logl  -3737.745 -3737.745
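Because the returned list also contains theta and beta, these could in principle be supplied to lvnet as well. A hedged sketch building on the example above (lvnetFit2 is hypothetical and not part of the original example):

## Assumed extension: pass all four model matrices returned by lav2lvnet to lvnet
lvnetFit2 <- lvnet(Data, lambda = mod$lambda, psi = mod$psi,
                   theta = mod$theta, beta = mod$beta)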