AutoScore_testing | R Documentation
AutoScore STEP(v): Evaluate the final score with ROC analysis (AutoScore Module 6)
Usage

AutoScore_testing(
  test_set,
  final_variables,
  cut_vec,
  scoring_table,
  threshold = "best",
  with_label = TRUE,
  metrics_ci = TRUE
)
Arguments

test_set
    A processed data set to be used for testing.

final_variables
    A vector containing the list of selected variables, selected from STEP(ii).

cut_vec
    Cut vector generated from STEP(iii).

scoring_table
    The final scoring table after fine-tuning, generated from STEP(iv).

threshold
    Score threshold for the ROC analysis to generate sensitivity, specificity, etc. If set to "best", the optimal threshold will be calculated (Default: "best").

with_label
    Set to TRUE if the test_set contains labels, so that performance will be evaluated accordingly (Default: TRUE). Set to FALSE if the test_set contains no "label" column; the final predicted scores will then be output without performance evaluation.

metrics_ci
    Whether to calculate confidence intervals for metrics such as sensitivity and specificity (Default: TRUE).
Value

A data frame with the predicted scores and the outcome for downstream visualization.
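A hedged sketch (not from the package documentation) of how the returned data frame could feed a downstream ROC visualization: the column names pred_score and Label, and the use of the pROC package, are assumptions for illustration rather than part of AutoScore_testing itself.

## Sketch only: assumes the returned data frame has a predicted-score column
## named "pred_score" and an outcome column named "Label".
library(AutoScore)
library(pROC)

out <- AutoScore_testing(test_set, final_variables, cut_vec, scoring_table)

roc_obj <- roc(response = out$Label, predictor = out$pred_score)
plot(roc_obj)  # ROC curve for visual inspection
auc(roc_obj)   # area under the curve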
References

Xie F, Chakraborty B, Ong MEH, Goldstein BA, Liu N. AutoScore: A Machine Learning-Based Automatic Clinical Score Generator and Its Application to Mortality Prediction Using Electronic Health Records. JMIR Medical Informatics 2020;8(10):e21798.
See Also

AutoScore_rank, AutoScore_parsimony, AutoScore_weighting, AutoScore_fine_tuning, print_roc_performance.

Run vignette("Guide_book", package = "AutoScore") to see the guidebook or vignette.
Examples

## Please see the guidebook or vignettes
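Below is a minimal, hypothetical sketch of how AutoScore_testing could be called at the end of the pipeline; the input objects are assumed to be outputs of the earlier steps (AutoScore_parsimony, AutoScore_weighting, AutoScore_fine_tuning) and are not defined here.

## Sketch only: test_set, final_variables, cut_vec and scoring_table are
## assumed to come from the earlier AutoScore steps.
library(AutoScore)

out <- AutoScore_testing(
  test_set        = test_set,         # processed test data
  final_variables = final_variables,  # variables selected in STEP(ii)
  cut_vec         = cut_vec,          # cut-offs from STEP(iii)
  scoring_table   = scoring_table,    # scoring table from STEP(iv)
  threshold       = "best",           # compute the optimal score threshold
  with_label      = TRUE,             # test_set contains the outcome label
  metrics_ci      = TRUE              # report confidence intervals for metrics
)
head(out)  # predicted score and outcome for each test-set row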