package biocaml
Module Biocaml_unix.Bin_pred
Performance measurement of binary classifiers.
This module provides functions to compute various performance measurements of a binary classifier's predictions. Typically, binary classifiers output both a label and a score indicating confidence in that label. A ROC curve represents how the sensitivity and specificity of the classifier vary as a function of a score threshold.
val confusion_matrix :
  scores:float array ->
  labels:bool array ->
  threshold:float ->
  confusion_matrix

confusion_matrix ~scores ~labels ~threshold computes a confusion matrix from the classifier scores and example labels, given a threshold. It assumes that example i has score scores.(i) and label labels.(i), that scores and labels have the same length, and that a higher score means a higher probability of a true label.
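As a self-contained sketch of the semantics described above (the record fields and the `>=` threshold convention are illustrative assumptions, not Biocaml's actual definitions):

```ocaml
(* Hypothetical confusion-matrix record; Biocaml's own type may differ. *)
type confusion_matrix = { tp : int; fp : int; tn : int; fn : int }

(* Classify example i as positive when scores.(i) >= threshold, then
   tally it against its true label labels.(i). *)
let confusion_matrix ~scores ~labels ~threshold =
  let m = ref { tp = 0; fp = 0; tn = 0; fn = 0 } in
  Array.iteri
    (fun i score ->
      let c = !m in
      m :=
        match score >= threshold, labels.(i) with
        | true, true -> { c with tp = c.tp + 1 }
        | true, false -> { c with fp = c.fp + 1 }
        | false, false -> { c with tn = c.tn + 1 }
        | false, true -> { c with fn = c.fn + 1 })
    scores;
  !m

let () =
  let scores = [| 0.9; 0.4; 0.7; 0.1 |]
  and labels = [| true; false; false; true |] in
  let m = confusion_matrix ~scores ~labels ~threshold:0.5 in
  (* 0.9/true -> tp, 0.7/false -> fp, 0.4/false -> tn, 0.1/true -> fn *)
  Printf.printf "tp=%d fp=%d tn=%d fn=%d\n" m.tp m.fp m.tn m.fn
```

Note that each example contributes to exactly one of the four counters, so the counters always sum to the number of examples.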
Same as sensitivity.
Same as positive_predictive_value.
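The measures mentioned above can be defined from the confusion-matrix counts. A minimal sketch, assuming a hypothetical record type (not Biocaml's actual one):

```ocaml
(* Illustrative confusion-matrix record; not Biocaml's actual type. *)
type confusion_matrix = { tp : int; fp : int; tn : int; fn : int }

(* Sensitivity (a.k.a. recall): fraction of true examples that were
   classified as positive. *)
let sensitivity m = float m.tp /. float (m.tp + m.fn)

(* Positive predictive value (a.k.a. precision): fraction of positive
   predictions that are actually true. *)
let positive_predictive_value m = float m.tp /. float (m.tp + m.fp)

let () =
  let m = { tp = 8; fp = 2; tn = 85; fn = 5 } in
  Printf.printf "sensitivity = %g, ppv = %g\n"
    (sensitivity m) (positive_predictive_value m)
```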
val performance_curve :
  scores:float array ->
  labels:bool array ->
  (float * confusion_matrix) array

performance_curve ~scores ~labels returns the series of confusion matrices obtained by varying the threshold from infinity down to neg_infinity. Each confusion matrix is paired with the corresponding threshold.
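A sketch of that threshold sweep, using the same illustrative record type and `>=` convention as above (assumptions, not Biocaml's implementation):

```ocaml
type confusion_matrix = { tp : int; fp : int; tn : int; fn : int }

let confusion_matrix ~scores ~labels ~threshold =
  let m = ref { tp = 0; fp = 0; tn = 0; fn = 0 } in
  Array.iteri
    (fun i score ->
      let c = !m in
      m :=
        match score >= threshold, labels.(i) with
        | true, true -> { c with tp = c.tp + 1 }
        | true, false -> { c with fp = c.fp + 1 }
        | false, false -> { c with tn = c.tn + 1 }
        | false, true -> { c with fn = c.fn + 1 })
    scores;
  !m

(* Sweep the threshold from infinity down through every observed score
   to neg_infinity, producing one (threshold, matrix) pair per step. *)
let performance_curve ~scores ~labels =
  let thresholds =
    Array.concat [ [| infinity |]; Array.copy scores; [| neg_infinity |] ]
  in
  Array.sort (fun a b -> Float.compare b a) thresholds; (* decreasing *)
  Array.map
    (fun t -> (t, confusion_matrix ~scores ~labels ~threshold:t))
    thresholds

let () =
  let curve =
    performance_curve
      ~scores:[| 0.9; 0.4; 0.7; 0.1 |]
      ~labels:[| true; false; false; true |]
  in
  Array.iter
    (fun (t, m) ->
      Printf.printf "t=%g tp=%d fp=%d tn=%d fn=%d\n" t m.tp m.fp m.tn m.fn)
    curve
```

At threshold infinity every example is predicted negative (tp = fp = 0); at neg_infinity every example is predicted positive (tn = fn = 0).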
roc_curve ~scores ~labels returns the ROC curve of the prediction, together with the associated Area Under the Curve (AUC).
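One way to obtain a ROC curve from the threshold sweep above is to convert each confusion matrix to a (false positive rate, true positive rate) point and integrate by the trapezoid rule. A self-contained sketch under the same assumptions (illustrative record type and `>=` convention, not Biocaml's implementation):

```ocaml
type confusion_matrix = { tp : int; fp : int; tn : int; fn : int }

let confusion_matrix ~scores ~labels ~threshold =
  let m = ref { tp = 0; fp = 0; tn = 0; fn = 0 } in
  Array.iteri
    (fun i score ->
      let c = !m in
      m :=
        match score >= threshold, labels.(i) with
        | true, true -> { c with tp = c.tp + 1 }
        | true, false -> { c with fp = c.fp + 1 }
        | false, false -> { c with tn = c.tn + 1 }
        | false, true -> { c with fn = c.fn + 1 })
    scores;
  !m

(* ROC points (FPR, TPR) for decreasing thresholds, plus the AUC
   computed by trapezoidal integration. *)
let roc_curve ~scores ~labels =
  let thresholds =
    Array.concat [ [| infinity |]; Array.copy scores; [| neg_infinity |] ]
  in
  Array.sort (fun a b -> Float.compare b a) thresholds;
  let points =
    Array.map
      (fun t ->
        let m = confusion_matrix ~scores ~labels ~threshold:t in
        let fpr = float m.fp /. float (m.fp + m.tn)
        and tpr = float m.tp /. float (m.tp + m.fn) in
        (fpr, tpr))
      thresholds
  in
  let auc = ref 0. in
  for i = 1 to Array.length points - 1 do
    let x0, y0 = points.(i - 1) and x1, y1 = points.(i) in
    auc := !auc +. ((x1 -. x0) *. (y0 +. y1) /. 2.)
  done;
  (points, !auc)

let () =
  let _points, auc =
    roc_curve
      ~scores:[| 0.9; 0.4; 0.7; 0.1 |]
      ~labels:[| true; false; false; true |]
  in
  Printf.printf "AUC = %g\n" auc (* this toy prediction has AUC = 0.5 *)
```

Because thresholds decrease, FPR is non-decreasing along the curve, so the trapezoid sum is well defined; the AUC equals the probability that a randomly chosen true example is scored higher than a randomly chosen false one.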