Package name

CONFUSION-MATRIX

Nicknames

CM

Package documentation

Lisp library for creating a confusion matrix, incrementally adding counts of results, and retrieving statistical measures from it.
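
A minimal usage sketch. The reading of the two count arguments (predicted is the classifier's output, observed is the actual label) is an assumption of these examples, and exact return values may be rationals or floats depending on the implementation:

  ;; Build a binary confusion matrix from 100 classified instances.
  (defvar *cm* (make-confusion-matrix))              ; default labels: POSITIVE NEGATIVE
  (confusion-matrix-add *cm* 'positive 'positive 50) ; 50 true positives
  (confusion-matrix-add *cm* 'negative 'positive 10) ; 10 false negatives
  (confusion-matrix-add *cm* 'positive 'negative 5)  ;  5 false positives
  (confusion-matrix-add *cm* 'negative 'negative 35) ; 35 true negatives
  (overall-accuracy *cm*)                            ; expected: (50+35)/100 = 0.85

The examples for the individual functions below reuse this 50/10/5/35 matrix, bound to *cm*.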

Functions

COHEN-KAPPA
Lambda list

cohen-kappa ( cm &key (for-label (first (confusion-matrix-labels cm))) )

Documentation

Cohen’s kappa statistic compares the observed accuracy with the accuracy expected by chance
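
For the 50/10/5/35 matrix built above, the observed accuracy is 0.85 and the accuracy expected by chance from the marginal totals is (55*60 + 45*40)/10000 = 0.51, so under the usual definition of the statistic the result should be about (0.85 - 0.51)/(1 - 0.51) = 0.69:

  (cohen-kappa *cm*)  ; expected: approximately 0.69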

CONFUSION-MATRIX-ADD
Lambda list

confusion-matrix-add ( cm predicted observed &optional (total 1) )

Documentation

Adds total to the count for the given (predicted observed) pair of labels; signals an error if either label is not known
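
For example, counts can be added one instance at a time or in batches:

  (let ((cm (make-confusion-matrix)))
    (confusion-matrix-add cm 'positive 'positive)    ; one instance at a time
    (confusion-matrix-add cm 'negative 'positive 10) ; or ten at once
    (confusion-matrix-total cm))                     ; => 11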

CONFUSION-MATRIX-COUNT
Lambda list

confusion-matrix-count ( cm predicted observed )

Documentation

Returns the count for the given (predicted observed) pair of labels; signals an error if either label is not known
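
For the 50/10/5/35 matrix:

  (confusion-matrix-count *cm* 'positive 'positive)  ; => 50
  (confusion-matrix-count *cm* 'negative 'positive)  ; => 10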

CONFUSION-MATRIX-P
Lambda list

confusion-matrix-p ( object )

Documentation

Returns true if object is a confusion matrix, and nil otherwise
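
A quick check, assuming the usual generalised-boolean behaviour of a structure predicate:

  (confusion-matrix-p (make-confusion-matrix))  ; => true
  (confusion-matrix-p 42)                       ; => NIL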

CONFUSION-MATRIX-TOTAL
Lambda list

confusion-matrix-total ( cm )

Documentation

Returns the total number of instances referenced in the confusion matrix
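
For the 50/10/5/35 matrix:

  (confusion-matrix-total *cm*)  ; => 100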

F-MEASURE
Lambda list

f-measure ( cm &key (for-label (first (confusion-matrix-labels cm))) )

Documentation

Harmonic mean of the precision and recall for the given label
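
For the matrix above, precision is 50/55 and recall 50/60, so the harmonic mean should be 2pr/(p + r) = 100/115, about 0.87:

  (f-measure *cm* :for-label 'positive)  ; expected: approximately 0.87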

FALSE-NEGATIVE
Lambda list

false-negative ( cm &key (for-label (first (confusion-matrix-labels cm))) )

Documentation

Returns the number of instances of the given label which are incorrectly observed as some other label

FALSE-POSITIVE
Lambda list

false-positive ( cm &key (for-label (first (confusion-matrix-labels cm))) )

Documentation

Returns the number of instances incorrectly observed as the given label

FALSE-RATE
Lambda list

false-rate ( cm &key (for-label (first (confusion-matrix-labels cm))) )

Documentation

Proportion of instances incorrectly observed as the given label, out of all instances not actually of that label
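
For the 50/10/5/35 matrix, and under the predicted/observed reading assumed in the opening example, the three functions above give:

  (false-negative *cm* :for-label 'positive)  ; expected: 10
  (false-positive *cm* :for-label 'positive)  ; expected: 5
  (false-rate *cm* :for-label 'positive)      ; expected: 5/40 = 0.125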

GEOMETRIC-MEAN
Lambda list

geometric-mean ( cm )

Documentation

The nth root of the product of the true-rate of each label, where n is the number of labels
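
With two labels this is the square root of the product of the two true-rates; for the matrix above, sqrt(50/60 * 35/40) is about 0.85:

  (geometric-mean *cm*)  ; expected: approximately 0.85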

MAKE-CONFUSION-MATRIX
Lambda list

make-confusion-matrix ( &key (labels '(positive negative)) (counts (mapcar #'(lambda (pair) (cons pair 0)) (map-product 'list labels labels))) )

Documentation

Creates a confusion matrix for the given list of labels, with every (predicted observed) count initialised to zero
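
The counts key is normally left at its default of all-zero counts; custom labels give a multi-class matrix:

  (make-confusion-matrix)                          ; binary, labels POSITIVE and NEGATIVE
  (make-confusion-matrix :labels '(cat dog bird))  ; a three-class matrix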

MATTHEWS-CORRELATION
Lambda list

matthews-correlation ( cm &key (for-label (first (confusion-matrix-labels cm))) )

Documentation

Measure of the quality of binary classifications
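
For the 50/10/5/35 matrix, the standard formula (TP*TN - FP*FN) / sqrt((TP+FP)*(TP+FN)*(TN+FP)*(TN+FN)) gives 1700/sqrt(5940000), about 0.70 (a sketch, assuming the library uses the standard definition):

  (matthews-correlation *cm* :for-label 'positive)  ; expected: approximately 0.70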

OVERALL-ACCURACY
Lambda list

overall-accuracy ( cm )

Documentation

Proportion of instances which are correctly labelled
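
For the matrix above, 85 of the 100 instances lie on the diagonal:

  (overall-accuracy *cm*)  ; expected: 85/100 = 0.85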

PRECISION
Lambda list

precision ( cm &key (for-label (first (confusion-matrix-labels cm))) )

Documentation

Proportion of instances observed as the given label which are correct
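
For the matrix above, 50 of the 55 instances labelled POSITIVE are correct:

  (precision *cm* :for-label 'positive)  ; expected: 50/55, approximately 0.91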

PREVALENCE
Lambda list

prevalence ( cm &key (for-label (first (confusion-matrix-labels cm))) )

Documentation

Proportion of instances observed as the given label, out of the total number of instances
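
Assuming "observed" here refers to the actual labels, the matrix above contains 60 actual positives among its 100 instances:

  (prevalence *cm* :for-label 'positive)  ; expected: 60/100 = 0.6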

RECALL
Lambda list

recall ( cm &key (for-label (first (confusion-matrix-labels cm))) )

Documentation

Recall is equal to the true rate, for a given label
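
For the matrix above, 50 of the 60 actual positives are recovered:

  (recall *cm* :for-label 'positive)  ; expected: 50/60, the same value as TRUE-RATE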

SENSITIVITY
Lambda list

sensitivity ( cm &key (for-label (first (confusion-matrix-labels cm))) )

Documentation

Sensitivity is another name for the true positive rate (recall), for a given label

SPECIFICITY
Lambda list

specificity ( cm &key (for-label (first (confusion-matrix-labels cm))) )

Documentation

Specificity is 1 - false-rate, for a given label
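
Both identities can be checked directly against the matrix above:

  (sensitivity *cm* :for-label 'positive)  ; expected: 50/60, same as RECALL
  (specificity *cm* :for-label 'positive)  ; expected: 1 - 5/40 = 0.875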

TRUE-NEGATIVE
Lambda list

true-negative ( cm &key (for-label (first (confusion-matrix-labels cm))) )

Documentation

Returns the number of instances NOT of the given label which are correctly observed

TRUE-POSITIVE
Lambda list

true-positive ( cm &key (for-label (first (confusion-matrix-labels cm))) )

Documentation

Returns the number of instances of the given label correctly observed
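
For the 50/10/5/35 matrix:

  (true-negative *cm* :for-label 'positive)  ; expected: 35
  (true-positive *cm* :for-label 'positive)  ; expected: 50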

TRUE-RATE
Lambda list

true-rate ( cm &key (for-label (first (confusion-matrix-labels cm))) )

Documentation

Proportion of instances of given label which are correctly observed
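
For the matrix above, this is the 50 true positives out of the 60 actual positives:

  (true-rate *cm* :for-label 'positive)  ; expected: 50/60, approximately 0.83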