Description
Perform a k-fold cross-validation for a learning algorithm or a fixed network structure.
Usage

bn.cv(data, bn, loss = NULL, k = 10, algorithm.args = list(),
      loss.args = list(), fit = "mle", fit.args = list(),
      cluster = NULL, debug = FALSE)
Arguments

data: a data frame containing the variables in the model.

bn: either a character string (the label of the learning algorithm to be applied to the training data in each iteration) or an object of class bn (a fixed network structure).

loss: a character string, the label of a loss function. If none is specified, the default loss function is the Log-Likelihood Loss for both discrete and continuous data sets. See below for additional details.

k: a positive integer number, the number of groups into which the data will be split.

algorithm.args: a list of extra arguments to be passed to the learning algorithm.

loss.args: a list of extra arguments to be passed to the loss function specified by loss.

fit: a character string, the label of the method used to fit the parameters of the network. See bn.fit for details.

fit.args: additional arguments for the parameter estimation procedure; see again bn.fit.

cluster: an optional cluster object from package snow; see the package documentation on snow integration for details and a simple example.

debug: a boolean value. If TRUE a lot of debugging output is printed; otherwise the function is completely silent.
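As an illustration of how these arguments combine in practice, here is a sketch (not part of the original manual page) that assumes the learning.test data set shipped with bnlearn, hill-climbing structure learning ('hc', as in the Examples below) and the BIC score label 'bic':

library(bnlearn)

# 10-fold cross-validation of hill-climbing structure learning on the
# discrete learning.test data: algorithm.args forwards extra arguments to
# the learning algorithm, loss.args to the loss function (here the target
# node whose classification error is measured).
cv.pred <- bn.cv(learning.test, bn = "hc", k = 10,
                 algorithm.args = list(score = "bic"),
                 loss = "pred", loss.args = list(target = "F"))
cv.pred

# the folds can also be processed in parallel through the cluster argument;
# a hypothetical two-worker setup (commented out) could look like this:
# cl <- parallel::makeCluster(2)
# bn.cv(learning.test, bn = "hc", loss = "pred",
#       loss.args = list(target = "F"), cluster = cl)
# parallel::stopCluster(cl)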
Details

The following loss functions are implemented:

  - Log-Likelihood Loss (logl): also known as negative entropy or negentropy, it is the negated expected log-likelihood of the test set for the Bayesian network fitted from the training set.

  - Gaussian Log-Likelihood Loss (logl-g): the negated expected log-likelihood for Gaussian Bayesian networks.

  - Classification Error (pred): the prediction error for a single node (specified by the target parameter in loss.args) in a discrete network.

  - Predictive Correlation (cor): the correlation between the observed and the predicted values for a single node (specified by the target parameter in loss.args) in a Gaussian Bayesian network.

  - Mean Squared Error (mse): the mean squared error between the observed and the predicted values for a single node (specified by the target parameter in loss.args) in a Gaussian Bayesian network.
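For instance, the node-specific losses are requested by naming the target node in loss.args. This is a sketch only, reusing the gaussian.test data set from bnlearn and picking node "F" as an arbitrary target:

library(bnlearn)

# predictive correlation and mean squared error both score a single node,
# identified by the target entry of loss.args.
cv.cor <- bn.cv(gaussian.test, "mmhc", loss = "cor",
                loss.args = list(target = "F"))
cv.mse <- bn.cv(gaussian.test, "mmhc", loss = "mse",
                loss.args = list(target = "F"))

# the Gaussian log-likelihood loss scores the whole network instead, so no
# target node is needed.
cv.logl <- bn.cv(gaussian.test, "mmhc", loss = "logl-g")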
Value

An object of class bn.kcv.
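A minimal sketch of inspecting the returned object; the per-fold layout described in the last comment is an assumption based on bnlearn's bn.kcv class and may differ between versions:

library(bnlearn)

cv <- bn.cv(learning.test, "hc", loss = "pred",
            loss.args = list(target = "F"))

# printing a bn.kcv object reports the loss function used and the expected
# loss estimated over the k folds.
cv

# the object also behaves as a list with one element per fold, assumed to
# contain the test-set indices, the network fitted on the remaining folds
# and the loss observed on the held-out fold.
str(cv[[1]], max.level = 1)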
Author(s)

Marco Scutari
References

Koller D, Friedman N (2009). Probabilistic Graphical Models: Principles and Techniques. MIT Press.
Examples

# cross-validate hill-climbing structure learning on the discrete data set,
# scoring each fold with the classification error of node F.
bn.cv(learning.test, 'hc', loss = "pred", loss.args = list(target = "F"))

# cross-validate the MMHC hybrid algorithm on the Gaussian data set with
# the default loss function.
bn.cv(gaussian.test, 'mmhc')