plotComparativeResultsBest (R Documentation)
This function generates plots comparing the best performance metrics (such as AUC, accuracy, recall, precision, and F1-score) across multiple methods. It visualizes both empirical performance and cross-validation (CV) generalization scores for each method, and it handles both classification and regression tasks.
Usage

plotComparativeResultsBest(digested.results, plot = TRUE, ylim = c(0.5, 1))
Arguments

digested.results: A list containing the performance results, including both empirical and cross-validation (CV) scores for the various methods.

plot: A logical value ('TRUE' or 'FALSE'). If 'TRUE', the function generates and displays the plots. Default is 'TRUE'.

ylim: A numeric vector of length 2 specifying the limits of the y-axis. Default is 'c(0.5, 1)'.
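The documentation does not specify the internal layout of 'digested.results'; the base-R sketch below illustrates one plausible shape, purely for orientation. All field and method names here ('best.scores', 'empirical', 'cv', the method labels) are illustrative assumptions, not the package's actual structure.

```r
# Hypothetical sketch of what `digested.results` might contain.
# Field names are assumptions for illustration only.
digested.results <- list(
  best.scores = list(
    # best empirical score per method
    empirical = data.frame(method = c("method.A", "method.B", "method.C"),
                           auc    = c(0.91, 0.88, 0.86)),
    # best cross-validation (generalization) score per method
    cv        = data.frame(method = c("method.A", "method.B", "method.C"),
                           auc    = c(0.84, 0.83, 0.82))
  )
)

# Empirical scores are typically at least as high as CV scores:
stopifnot(all(digested.results$best.scores$empirical$auc >=
              digested.results$best.scores$cv$auc))
```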
Details

The function generates multiple plots comparing the best performance metrics (AUC, accuracy, recall, precision, F1-score, and correlation) across the methods. The plots include:

- Empirical performance for each method.
- Cross-validation (generalization) performance for each method.

Both classification and regression models are supported, and the best performance of each method is displayed. The plots are arranged using the 'multiplot' function.
Value

If 'plot = TRUE', the function displays the plots. If 'plot = FALSE', it returns a list of ggplot objects for further manipulation.
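The draw-or-return behavior described above follows a common R pattern. The base-R sketch below mimics it with hypothetical helpers ('make_panels', 'plot_or_return' are stand-ins, not the package's internals, and plain lists stand in for ggplot objects):

```r
# Sketch of the draw-or-return pattern (hypothetical helper names).
make_panels <- function(scores) {
  # Each element stands in for one ggplot panel, one per method.
  lapply(names(scores), function(m) list(method = m, value = scores[[m]]))
}

plot_or_return <- function(scores, plot = TRUE) {
  panels <- make_panels(scores)
  if (plot) {
    print(panels)       # the real function draws via multiplot()
    invisible(panels)
  } else {
    panels              # returned for further manipulation
  }
}

res <- plot_or_return(list(method.A = 0.91, method.B = 0.88), plot = FALSE)
stopifnot(length(res) == 2, res[[1]]$method == "method.A")
```

With 'plot = FALSE' the caller can post-process each panel (for example, add a theme to a returned ggplot object) before displaying it.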
Author(s)

Edi Prifti (IRD)
Examples

# Assuming digested.results contains the performance scores for the methods
plotComparativeResultsBest(digested.results, plot = TRUE, ylim = c(0.5, 1))
# Adjust 'ylim' to change the y-axis range, or set 'plot = FALSE' to retrieve
# the ggplot objects instead of displaying them.