Selective Bayesian Forest Classifier (SBFC) algorithm
Description:

Runs the SBFC algorithm on a discretized data set. To discretize your data, use the data_disc method.

Arguments:

data: Discretized data set: a list containing the training data matrix TrainX and class vector TrainY, and optionally the test data matrix TestX and class vector TestY.
nstep: Number of MCMC steps; defaults to max(10000, 10 * ncol(TrainX)).
thin: Thinning factor for the MCMC.
burnin_denom: Denominator of the fraction of total MCMC steps discarded as burn-in (default = 5).
cv: Whether to do cross-validation on the training set (used only if a test set is not provided).
thinoutputs: Return thinned MCMC outputs (parents, groups, trees, logposterior) rather than all outputs (default = FALSE).
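As a rough illustration of how the MCMC control arguments interact (the burn-in arithmetic below is an assumption based on the argument descriptions, not taken from the package source):

```r
# Sketch: how many thinned post-burn-in samples to expect.
nstep <- 10000        # total MCMC steps
burnin_denom <- 5     # discard the first nstep / burnin_denom steps (assumed)
thin <- 50            # keep every 50th sample (illustrative value)

burnin <- nstep / burnin_denom   # 2000 steps discarded as burn-in
kept <- (nstep - burnin) / thin  # about 160 retained samples
kept
```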
Details:

The data needs to be discretized before running SBFC. If the test data matrix TestX is provided, SBFC runs on the entire training set TrainX and returns predicted class labels for the test data; if the test class vector TestY is also provided, the test accuracy is computed. If TestX is not provided and cv is set to TRUE, SBFC performs cross-validation on the training set TrainX, and returns predicted classes and cross-validation accuracy for the training data.
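A minimal sketch of the two modes described above, using simulated stand-in data (the 0/1/2 level coding and 0/1 class labels are illustrative assumptions; real data should be discretized with data_disc):

```r
library(sbfc)  # assumed installed: install.packages("sbfc")

set.seed(1)
# Toy "discretized" data: 10 three-level predictors, binary class.
X <- matrix(sample(0:2, 200 * 10, replace = TRUE), nrow = 200)
y <- sample(0:1, 200, replace = TRUE)

# Mode 1: test set provided -- train on TrainX/TrainY, predict TestX.
d <- list(TrainX = X[1:150, ], TrainY = y[1:150],
          TestX  = X[151:200, ], TestY = y[151:200])
fit <- sbfc(d)
fit$accuracy      # test-set accuracy

# Mode 2: no test set -- cross-validation on the training data.
d_cv <- list(TrainX = X[1:150, ], TrainY = y[1:150])
cv_fit <- sbfc(d_cv, cv = TRUE)
cv_fit$accuracy   # cross-validation accuracy
```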
Value:

An object of class sbfc, a list containing the following components:

accuracy: Classification accuracy (on the test set if provided, otherwise cross-validation accuracy on the training set).
predictions: Vector of class label predictions (for the test set if provided, otherwise for the training set).
probabilities: Matrix of class label probabilities (for the test set if provided, otherwise for the training set).
runtime: Total runtime of the algorithm in seconds.
parents: Matrix representing the structures sampled by MCMC, where parents[i,j] is the index of the parent of node j at iteration i (0 if the node is a root).
groups: Matrix representing the structures sampled by MCMC, where groups[i,j] indicates which group node j belongs to at iteration i (0 is noise, 1 is signal).
trees: Matrix representing the structures sampled by MCMC, where trees[i,j] indicates which tree node j belongs to at iteration i.
logposterior: Vector giving the log posterior at each iteration of the MCMC.
If cv = TRUE, the MCMC samples from the first fold are returned.
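As a sketch of how the sampled structures might be summarized (a simulated stand-in for the groups matrix is used here; with a real fit you would use the groups component of the returned object):

```r
# groups[i, j] is 1 when node j is in the signal group at iteration i,
# so the column means estimate each node's posterior signal probability.
set.seed(1)
groups <- matrix(rbinom(200 * 10, 1, 0.3), nrow = 200)  # 200 samples, 10 nodes
signal_prob <- colMeans(groups)   # per-node fraction of samples in group 1
which(signal_prob > 0.5)          # nodes mostly classified as signal
```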