Package name
CONFUSION-MATRIX
Nicknames
CM
Package documentation
A Common Lisp library for creating a confusion matrix, incrementally adding counts of (predicted observed) label pairs, and retrieving statistical measures from the matrix.
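A minimal usage sketch built from the functions documented below; the keyword labels, counts, and variable names are illustrative, not part of the library:

```lisp
;; Create a 2x2 matrix over two labels, record some
;; (predicted observed) counts, then query summary statistics.
(let ((cm (make-confusion-matrix :labels '(:yes :no))))
  (confusion-matrix-add cm :yes :yes 10) ; predicted :yes, observed :yes
  (confusion-matrix-add cm :yes :no 2)
  (confusion-matrix-add cm :no :yes 3)
  (confusion-matrix-add cm :no :no 25)
  (list (confusion-matrix-total cm)      ; total instances recorded
        (overall-accuracy cm)            ; proportion on the diagonal
        (precision cm :for-label :yes)
        (recall cm :for-label :yes)))
```

Keyword symbols are used for the labels so the example does not depend on which package the default POSITIVE and NEGATIVE symbols live in.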
Functions
- Lambda list
-
cohen-kappa ( cm &key (for-label (first (confusion-matrix-labels cm))) )
- Documentation
-
Cohen's Kappa statistic compares the observed accuracy with the accuracy expected by chance, for the given label
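In the conventional notation, with $p_o$ the observed accuracy and $p_e$ the accuracy expected by chance (computed from the marginal totals for the label), Cohen's kappa is:

```latex
\kappa = \frac{p_o - p_e}{1 - p_e}
```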
- Lambda list
-
confusion-matrix-add ( cm predicted observed &optional (total 1) )
- Documentation
-
Adds total to the count for the given (predicted observed) pair of labels; signals an error if either label is not known
- Lambda list
-
confusion-matrix-count ( cm predicted observed )
- Documentation
-
Returns the count for the given (predicted observed) pair of labels; signals an error if either label is not known
- Lambda list
-
confusion-matrix-p ( object )
- Documentation
-
Returns true if object is a confusion matrix, and NIL otherwise
- Lambda list
-
confusion-matrix-total ( cm )
- Documentation
-
Returns the total number of instances referenced in the confusion matrix
- Lambda list
-
f-measure ( cm &key (for-label (first (confusion-matrix-labels cm))) )
- Documentation
-
Harmonic mean of the precision and recall for the given label
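Writing $P$ and $R$ for the precision and recall of the given label, this balanced F-measure is:

```latex
F = \frac{2 P R}{P + R}
```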
- Lambda list
-
false-negative ( cm &key (for-label (first (confusion-matrix-labels cm))) )
- Documentation
-
Returns the number of instances of the given label which are incorrectly observed as another label
- Lambda list
-
false-positive ( cm &key (for-label (first (confusion-matrix-labels cm))) )
- Documentation
-
Returns the number of instances incorrectly observed as the given label
- Lambda list
-
false-rate ( cm &key (for-label (first (confusion-matrix-labels cm))) )
- Documentation
-
Proportion of instances incorrectly observed as the given label, out of all instances not originally of that label
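In terms of the per-label false-positive and true-negative counts, this is the conventional false positive rate:

```latex
\mathit{false\text{-}rate} = \frac{FP}{FP + TN}
```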
- Lambda list
-
geometric-mean ( cm )
- Documentation
-
The nth root of the product of the true-rate for each of the n labels
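For a matrix with $n$ labels this is:

```latex
G = \left( \prod_{i=1}^{n} \mathit{true\text{-}rate}_i \right)^{1/n}
```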
- Lambda list
-
make-confusion-matrix ( &key (labels '(positive negative)) (counts (mapcar #'(lambda (pair) (cons pair 0)) (map-product 'list labels labels))) )
- Documentation
-
Creates a confusion matrix over the given labels (by default, POSITIVE and NEGATIVE), with the count for every (predicted observed) pair initialised to zero
- Lambda list
-
matthews-correlation ( cm &key (for-label (first (confusion-matrix-labels cm))) )
- Documentation
-
The Matthews correlation coefficient, a measure of the quality of binary classifications
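In terms of the per-label true/false positive/negative counts, the Matthews correlation coefficient is conventionally defined as:

```latex
\mathrm{MCC} = \frac{TP \cdot TN - FP \cdot FN}
                    {\sqrt{(TP+FP)(TP+FN)(TN+FP)(TN+FN)}}
```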
- Lambda list
-
overall-accuracy ( cm )
- Documentation
-
Proportion of instances which are correctly labelled
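Writing $C_{ij}$ for the count where label $i$ is predicted and label $j$ observed, and $N$ for the matrix total, this is the sum of the diagonal over the total:

```latex
\mathrm{accuracy} = \frac{\sum_i C_{ii}}{N}
```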
- Lambda list
-
precision ( cm &key (for-label (first (confusion-matrix-labels cm))) )
- Documentation
-
Proportion of instances observed as the given label which are correctly labelled
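In per-label true/false positive terms this is:

```latex
\mathrm{precision} = \frac{TP}{TP + FP}
```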
- Lambda list
-
prevalence ( cm &key (for-label (first (confusion-matrix-labels cm))) )
- Documentation
-
Proportion of instances observed as the given label, out of the total number of instances
- Lambda list
-
recall ( cm &key (for-label (first (confusion-matrix-labels cm))) )
- Documentation
-
Recall is equal to the true rate, for a given label
- Lambda list
-
sensitivity ( cm &key (for-label (first (confusion-matrix-labels cm))) )
- Documentation
-
Sensitivity is another name for the true positive rate (recall), for a given label
- Lambda list
-
specificity ( cm &key (for-label (first (confusion-matrix-labels cm))) )
- Documentation
-
Specificity is 1 - false-rate, for a given label
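Equivalently, in per-label true-negative and false-positive terms:

```latex
\mathrm{specificity} = \frac{TN}{TN + FP} = 1 - \mathit{false\text{-}rate}
```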
- Lambda list
-
true-negative ( cm &key (for-label (first (confusion-matrix-labels cm))) )
- Documentation
-
Returns the number of instances NOT of the given label which are correctly observed
- Lambda list
-
true-positive ( cm &key (for-label (first (confusion-matrix-labels cm))) )
- Documentation
-
Returns the number of instances of the given label correctly observed
- Lambda list
-
true-rate ( cm &key (for-label (first (confusion-matrix-labels cm))) )
- Documentation
-
Proportion of instances of given label which are correctly observed
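In per-label terms this is the conventional true positive rate, which is also what recall and sensitivity return:

```latex
\mathit{true\text{-}rate} = \frac{TP}{TP + FN}
```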