Package name

CONFUSION-MATRIX

Nicknames

CM

Package documentation

A Common Lisp library for creating a confusion matrix, incrementally adding results, and retrieving statistical measures.
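
Example

A minimal usage sketch. The label names (:positive, :negative) and the counts below are invented for illustration, and the worked values read the observed argument as the ground-truth label of an instance; exact return types (rational vs. float) depend on the library. Because :positive is listed first, it is the default for-label of the per-label statistics.

  (defparameter *cm*
    (cm:make-confusion-matrix :labels '(:positive :negative)))

  ;; arguments: matrix, predicted label, observed label, number of instances
  (cm:confusion-matrix-add *cm* :positive :positive 40) ; true positives
  (cm:confusion-matrix-add *cm* :negative :positive 10) ; false negatives
  (cm:confusion-matrix-add *cm* :positive :negative  5) ; false positives
  (cm:confusion-matrix-add *cm* :negative :negative 45) ; true negatives

  (cm:confusion-matrix-total *cm*) ; => 100
  (cm:overall-accuracy *cm*)       ; (40 + 45) / 100, about 0.85

This *cm* is reused in the examples for the individual functions below.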

Functions

COHEN-KAPPA
Lambda list

cohen-kappa ( cm &key for-label )

Documentation
  • cm - confusion matrix

  • for-label - class label to use as 'positive' - defaults to first label in definition

Returns Cohen’s Kappa statistic, which compares the observed accuracy with the accuracy expected by chance.
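
Example

An illustrative call on the *cm* built in the example at the top of this page (observed accuracy 0.85, chance accuracy 0.5); the worked value assumes the standard definition of Cohen's Kappa.

  (cm:cohen-kappa *cm*) ; (0.85 - 0.5) / (1 - 0.5), about 0.7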

CONFUSION-MATRIX-ADD
Lambda list

confusion-matrix-add ( cm predicted observed &optional total )

Documentation
  • cm - confusion matrix

  • predicted - predicted label of instance

  • observed - observed label of instance

  • total - number of instances to add

Adds total to the count for given (predicted observed) labels.

Error

Signals an error if either label is not part of the confusion matrix.
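
Example

A sketch using a fresh matrix with invented labels; the behaviour when total is omitted (assumed here to be an increment of 1) is not stated above.

  (let ((cm (cm:make-confusion-matrix :labels '(:positive :negative))))
    (cm:confusion-matrix-add cm :positive :negative 3)  ; add 3 to the (:positive :negative) cell
    (cm:confusion-matrix-add cm :positive :negative)    ; total omitted
    (cm:confusion-matrix-count cm :positive :negative)) ; => 4, if the default increment is 1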

CONFUSION-MATRIX-COUNT
Lambda list

confusion-matrix-count ( cm predicted observed )

Documentation
  • cm - confusion matrix

  • predicted - predicted label of instance

  • observed - observed label of instance

Returns the count for given (predicted observed) labels.

Error

Signals an error if either label is not part of the confusion matrix.
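
Example

Reading back two of the cells of the *cm* built in the example at the top of this page.

  (cm:confusion-matrix-count *cm* :positive :negative) ; => 5  (predicted :positive, observed :negative)
  (cm:confusion-matrix-count *cm* :positive :positive) ; => 40 (predicted and observed :positive)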

CONFUSION-MATRIX-P
Lambda list

confusion-matrix-p ( object )

Documentation

  • object - object to test

Returns true if object is a confusion matrix, and false otherwise.

CONFUSION-MATRIX-TOTAL
Lambda list

confusion-matrix-total ( cm )

Documentation
  • cm - confusion matrix

Returns the total number of instances referenced in the confusion matrix.

F-MEASURE
Lambda list

f-measure ( cm &key for-label )

Documentation
  • cm - confusion matrix

  • for-label - class label to use as 'positive' - defaults to first label in definition

Returns harmonic mean of the precision and recall for the given label.
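
Example

For the *cm* built at the top of this page, precision for :positive is 40/45 and recall is 40/50 (assuming observed is the ground truth), so:

  (cm:f-measure *cm* :for-label :positive) ; 2pr / (p + r) = 16/19, about 0.84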

FALSE-NEGATIVE
Lambda list

false-negative ( cm &key for-label )

Documentation
  • cm - confusion matrix

  • for-label - class label to use as 'positive' - defaults to first label in definition

Returns the number of instances of the given label which are incorrectly observed as some other label (the false negatives).

FALSE-POSITIVE
Lambda list

false-positive ( cm &key for-label )

Documentation
  • cm - confusion matrix

  • for-label - class label to use as 'positive' - defaults to first label in definition

Returns the number of instances of other labels which are incorrectly observed as the given label (the false positives).
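
Example

Counts from the *cm* built at the top of this page; which cell counts as a false negative versus a false positive assumes that observed is the ground-truth label. for-label defaults to :positive, the first label in the definition.

  (cm:false-negative *cm*) ; => 10 (observed :positive, predicted :negative)
  (cm:false-positive *cm*) ; => 5  (predicted :positive, observed :negative)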

FALSE-RATE
Lambda list

false-rate ( cm &key for-label )

Documentation
  • cm - confusion matrix

  • for-label - class label to use as 'positive' - defaults to first label in definition

Returns the proportion of instances incorrectly observed as the given label, out of all instances not originally of that label; equal to FALSE-POSITIVE / (FALSE-POSITIVE + TRUE-NEGATIVE).
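
Example

For the *cm* built at the top of this page (5 false positives and 45 true negatives for :positive, assuming observed is the ground truth):

  (cm:false-rate *cm* :for-label :positive) ; 5 / (5 + 45) = 1/10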

GEOMETRIC-MEAN
Lambda list

geometric-mean ( cm )

Documentation
  • cm - confusion matrix

Returns the nth root of the product of the TRUE-RATE for each of the n labels.
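
Example

For the *cm* built at the top of this page, the TRUE-RATE is about 0.8 for :positive and 0.9 for :negative, so with two labels:

  (cm:geometric-mean *cm*) ; sqrt(0.8 * 0.9), about 0.85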

MAKE-CONFUSION-MATRIX
Lambda list

make-confusion-matrix ( &key labels counts )

Documentation

  • labels - list of class labels for the matrix

  • counts - initial counts for the matrix, if any

Creates and returns a new confusion matrix.
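
Example

A sketch of constructing a matrix over invented labels; the format of the counts argument is not shown here.

  (defparameter *pets* (cm:make-confusion-matrix :labels '(:cat :dog :fish)))
  (cm:confusion-matrix-p *pets*) ; => true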

MATTHEWS-CORRELATION
Lambda list

matthews-correlation ( cm &key for-label )

Documentation
  • cm - confusion matrix

  • for-label - class label to use as 'positive' - defaults to first label in definition

Returns the Matthews correlation coefficient, a measure of the quality of binary classifications.
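
Example

For the *cm* built at the top of this page (TP 40, TN 45, FP 5, FN 10 for :positive), assuming the standard Matthews correlation formula:

  (cm:matthews-correlation *cm*) ; (40*45 - 5*10) / sqrt(45 * 50 * 50 * 55), about 0.70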

OVERALL-ACCURACY
Lambda list

overall-accuracy ( cm )

Documentation
  • cm - confusion matrix

Returns proportion of instances which are correctly labelled.

PRECISION
Lambda list

precision ( cm &key for-label )

Documentation
  • cm - confusion matrix

  • for-label - class label to use as 'positive' - defaults to first label in definition

Returns the proportion of instances observed as the given label which are actually of that label; equal to TRUE-POSITIVE / (TRUE-POSITIVE + FALSE-POSITIVE).
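
Example

For the *cm* built at the top of this page (40 true positives and 5 false positives for :positive):

  (cm:precision *cm* :for-label :positive) ; 40 / (40 + 5) = 8/9, about 0.89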

PREVALENCE
Lambda list

prevalence ( cm &key for-label )

Documentation
  • cm - confusion matrix

  • for-label - class label to use as 'positive' - defaults to first label in definition

Returns the proportion of instances of the given label, out of the total number of instances.
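
Example

For the *cm* built at the top of this page, 50 of the 100 instances belong to :positive; the worked value assumes prevalence is computed over the actual class of each instance.

  (cm:prevalence *cm* :for-label :positive) ; (40 + 10) / 100 = 1/2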

RECALL
Lambda list

recall ( cm &key for-label )

Documentation
  • cm - confusion matrix

  • for-label - class label to use as 'positive' - defaults to first label in definition

Returns recall value, which is equal to the TRUE-RATE, for a given label.
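
Example

Using the *cm* built at the top of this page (assuming observed is the ground truth):

  (cm:recall *cm* :for-label :positive)        ; 40 / (40 + 10) = 4/5
  (= (cm:recall *cm* :for-label :positive)
     (cm:true-rate *cm* :for-label :positive)) ; => T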

SENSITIVITY
Lambda list

sensitivity ( cm &key for-label )

Documentation
  • cm - confusion matrix

  • for-label - class label to use as 'positive' - defaults to first label in definition

Returns sensitivity value, which is another name for the TRUE-RATE (recall), for a given label.

SPECIFICITY
Lambda list

specificity ( cm &key for-label )

Documentation
  • cm - confusion matrix

  • for-label - class label to use as 'positive' - defaults to first label in definition

Returns specificity, which is 1 - FALSE-RATE, for a given label.
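
Example

For the *cm* built at the top of this page, the FALSE-RATE for :positive is 1/10, so:

  (cm:specificity *cm* :for-label :positive) ; 1 - 1/10 = 9/10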

TRUE-NEGATIVE
Lambda list

true-negative ( cm &key for-label )

Documentation
  • cm - confusion matrix

  • for-label - class label to use as 'positive' - defaults to first label in definition

Returns the number of instances NOT of the given label which are correctly observed as not of that label (the true negatives).

TRUE-POSITIVE
Lambda list

true-positive ( cm &key for-label )

Documentation
  • cm - confusion matrix

  • for-label - class label to use as 'positive' - defaults to first label in definition

Returns the number of instances of the given label which are correctly observed as that label (the true positives).
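
Example

Counts from the *cm* built at the top of this page; for-label defaults to :positive, the first label in the definition.

  (cm:true-positive *cm*) ; => 40 (predicted and observed :positive)
  (cm:true-negative *cm*) ; => 45 (predicted and observed :negative)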

TRUE-RATE
Lambda list

true-rate ( cm &key for-label )

Documentation
  • cm - confusion matrix

  • for-label - class label to use as 'positive' - defaults to first label in definition

Returns the proportion of instances of the given label which are correctly observed; equal to TRUE-POSITIVE / (TRUE-POSITIVE + FALSE-NEGATIVE).
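
Example

For the *cm* built at the top of this page (assuming observed is the ground truth):

  (cm:true-rate *cm* :for-label :positive) ; 40 / (40 + 10) = 4/5
  (cm:true-rate *cm* :for-label :negative) ; 45 / (45 + 5) = 9/10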