package biocaml
Module Biocaml_unix.Bin_pred
Performance measurement of binary classifiers.
This module provides functions to compute various performance measurements of a binary classifier's prediction. Typically, binary classifiers output both a label and a score indicating a confidence level. A ROC curve represents the variation of sensitivity and specificity of the classifier as a function of a score threshold.
val confusion_matrix :
  scores:float array ->
  labels:bool array ->
  threshold:float ->
  confusion_matrix

confusion_matrix ~scores ~labels ~threshold computes a confusion matrix from the classifier scores and example labels, based on a threshold. It assumes that example i has score scores.(i) and label labels.(i), that scores and labels have the same length, and that a higher score means a higher probability of a true label.
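To make the threshold semantics concrete, here is a minimal standalone sketch of what such a computation might look like. This is not Biocaml's implementation: the record fields (tp, tn, fp, fn) are illustrative assumptions, since the library's confusion_matrix type is abstract here.

```ocaml
(* Illustrative sketch only; field names are assumptions, not Biocaml's. *)
type confusion_matrix = { tp : int; tn : int; fp : int; fn : int }

let confusion_matrix ~scores ~labels ~threshold =
  let tp = ref 0 and tn = ref 0 and fp = ref 0 and fn = ref 0 in
  Array.iteri
    (fun i s ->
      (* Predict positive when the score reaches the threshold. *)
      let predicted = s >= threshold in
      match predicted, labels.(i) with
      | true, true -> incr tp
      | false, false -> incr tn
      | true, false -> incr fp
      | false, true -> incr fn)
    scores;
  { tp = !tp; tn = !tn; fp = !fp; fn = !fn }

let () =
  let m =
    confusion_matrix
      ~scores:[| 0.9; 0.8; 0.3; 0.1 |]
      ~labels:[| true; false; true; false |]
      ~threshold:0.5
  in
  Printf.printf "tp=%d fp=%d fn=%d tn=%d\n" m.tp m.fp m.fn m.tn
```

With these sample inputs, the score 0.8 is a false positive and the score 0.3 is a false negative, so each cell of the matrix ends up at 1.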
val positive : confusion_matrix -> int
val negative : confusion_matrix -> int
val cardinal : confusion_matrix -> int
val sensitivity : confusion_matrix -> float
val recall : confusion_matrix -> float

Same as sensitivity.

val false_positive_rate : confusion_matrix -> float
val accuracy : confusion_matrix -> float
val specificity : confusion_matrix -> float
val positive_predictive_value : confusion_matrix -> float
val precision : confusion_matrix -> float

Same as positive_predictive_value.

val negative_predictive_value : confusion_matrix -> float
val false_discovery_rate : confusion_matrix -> float
val f1_score : confusion_matrix -> float

val performance_curve :
  scores:float array ->
  labels:bool array ->
  (float * confusion_matrix) array

performance_curve ~scores ~labels returns the series of confusion matrices obtained by varying the threshold from infinity to neg_infinity. Each confusion matrix comes with the corresponding threshold.
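The derived rates above all follow from the four cells of the matrix using the standard textbook formulas. The sketch below shows those formulas; it assumes the same illustrative record fields as before and is not Biocaml's code.

```ocaml
(* Standard definitions of the derived metrics, written against an
   illustrative confusion-matrix record (not Biocaml's abstract type). *)
type confusion_matrix = { tp : int; tn : int; fp : int; fn : int }

let sensitivity m = float m.tp /. float (m.tp + m.fn)  (* = recall *)
let specificity m = float m.tn /. float (m.tn + m.fp)
let precision m = float m.tp /. float (m.tp + m.fp)    (* = PPV *)
let false_positive_rate m = float m.fp /. float (m.fp + m.tn)
let accuracy m =
  float (m.tp + m.tn) /. float (m.tp + m.tn + m.fp + m.fn)
let f1_score m =
  (* Harmonic mean of precision and recall. *)
  let p = precision m and r = sensitivity m in
  2. *. p *. r /. (p +. r)

let () =
  let m = { tp = 90; fn = 10; fp = 20; tn = 80 } in
  Printf.printf "sensitivity=%.2f precision=%.2f\n"
    (sensitivity m) (precision m)
```

For the sample matrix above, sensitivity is 90 / (90 + 10) = 0.90 and precision is 90 / (90 + 20) ≈ 0.82.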
val roc_curve :
  scores:float array ->
  labels:bool array ->
  (float * float) array * float

roc_curve ~scores ~labels returns the ROC curve of the prediction, and the associated Area Under Curve (AUC).
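One common way to obtain such a curve is to sort examples by decreasing score, sweep the threshold past each one, and integrate the resulting (FPR, TPR) points with the trapezoidal rule. The sketch below does exactly that; it is an illustrative re-implementation under those assumptions, not Biocaml's algorithm (in particular, it does not merge tied scores).

```ocaml
(* Illustrative ROC sweep: points are (false positive rate, true positive
   rate) as the threshold moves from +infinity downwards. Not Biocaml's
   implementation; ties between scores are not merged in this sketch. *)
let roc_curve ~scores ~labels =
  let n = Array.length scores in
  let order = Array.init n (fun i -> i) in
  (* Sort indices by decreasing score. *)
  Array.sort (fun i j -> compare scores.(j) scores.(i)) order;
  let pos = Array.fold_left (fun a b -> if b then a + 1 else a) 0 labels in
  let neg = n - pos in
  let tp = ref 0 and fp = ref 0 in
  let pts = ref [ (0., 0.) ] in
  Array.iter
    (fun i ->
      if labels.(i) then incr tp else incr fp;
      pts := (float !fp /. float neg, float !tp /. float pos) :: !pts)
    order;
  let pts = List.rev !pts in
  (* Trapezoidal rule over consecutive (fpr, tpr) points. *)
  let rec auc = function
    | (x1, y1) :: ((x2, y2) :: _ as rest) ->
        ((x2 -. x1) *. (y1 +. y2) /. 2.) +. auc rest
    | _ -> 0.
  in
  (Array.of_list pts, auc pts)
```

A classifier that ranks every positive above every negative traces the curve through (0, 1) and gets an AUC of 1.0; random scores hover around 0.5.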
val recall_precision_curve :
  scores:float array ->
  labels:bool array ->
  (float * float) array * float

recall_precision_curve ~scores ~labels returns the RP curve of the prediction, and the associated average precision.
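The same score sweep yields the recall-precision curve, with average precision commonly computed as the sum over ranks of (R_k - R_{k-1}) * P_k. The sketch below follows that convention; it is illustrative and does not claim to match Biocaml's exact interpolation choices.

```ocaml
(* Illustrative RP sweep with average precision computed as
   sum_k (R_k - R_{k-1}) * P_k; not Biocaml's implementation. *)
let recall_precision_curve ~scores ~labels =
  let n = Array.length scores in
  let order = Array.init n (fun i -> i) in
  (* Sort indices by decreasing score. *)
  Array.sort (fun i j -> compare scores.(j) scores.(i)) order;
  let pos = Array.fold_left (fun a b -> if b then a + 1 else a) 0 labels in
  let tp = ref 0 and seen = ref 0 in
  let pts = ref [] and ap = ref 0. and prev_r = ref 0. in
  Array.iter
    (fun i ->
      incr seen;
      if labels.(i) then incr tp;
      let r = float !tp /. float pos in   (* recall so far *)
      let p = float !tp /. float !seen in (* precision so far *)
      pts := (r, p) :: !pts;
      ap := !ap +. ((r -. !prev_r) *. p);
      prev_r := r)
    order;
  (Array.of_list (List.rev !pts), !ap)
```

Unlike the ROC curve, the RP curve ignores true negatives entirely, which is why it is often preferred when positives are rare.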