## gov.sandia.cognition.learning.performance.categorization Interface BinaryConfusionMatrix

All Superinterfaces:
Cloneable, CloneableSerializable, ConfusionMatrix<Boolean>, Serializable
All Known Implementing Classes:
AbstractBinaryConfusionMatrix, DefaultBinaryConfusionMatrix

`public interface BinaryConfusionMatrix extends ConfusionMatrix<Boolean>`

An interface for a binary confusion matrix. It is defined as a `ConfusionMatrix` over Boolean objects. It treats true as the positive category and false as the negative category.
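To make the category convention concrete, here is an illustrative, self-contained sketch (not the library's `DefaultBinaryConfusionMatrix`) of how the four cells of a binary confusion matrix accumulate under the (actual, predicted) indexing this interface uses:

```java
// Illustrative sketch only -- not the Foundry's DefaultBinaryConfusionMatrix.
// It mirrors the interface's convention: true is the positive category,
// false is the negative category, and entries are indexed (actual, predicted).
public class BinaryCounts {
    double truePositives;   // (true, true)
    double falsePositives;  // (false, true)
    double trueNegatives;   // (false, false)
    double falseNegatives;  // (true, false)

    // Record one (actual, predicted) pair, as ConfusionMatrix.add(target, estimate) would.
    public void add(boolean actual, boolean predicted) {
        if (actual && predicted)       { truePositives++; }
        else if (!actual && predicted) { falsePositives++; }
        else if (!actual)              { trueNegatives++; }
        else                           { falseNegatives++; }
    }

    public double getTotalCount() {
        return truePositives + falsePositives + trueNegatives + falseNegatives;
    }
}
```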

Since:
3.1
Author:
Justin Basilico

Method Summary

| Return Type | Method | Description |
|---|---|---|
| `double` | `getFalseNegativesCount()` | Gets the number of false negatives. |
| `double` | `getFalseNegativesFraction()` | Gets the fraction of false negatives. |
| `double` | `getFalsePositivesCount()` | Gets the number of false positives. |
| `double` | `getFalsePositivesFraction()` | Gets the fraction of false positives. |
| `double` | `getFScore()` | The F-score of the confusion matrix, also known as the F1-score or F-measure. |
| `double` | `getFScore(double beta)` | The F-score for the confusion matrix with the given trade-off parameter (beta). |
| `double` | `getPrecision()` | The precision value for the confusion matrix. |
| `double` | `getRecall()` | The recall value for the confusion matrix. |
| `double` | `getSensitivity()` | The sensitivity value for the confusion matrix. |
| `double` | `getSpecificity()` | The specificity value for the confusion matrix. |
| `double` | `getTrueNegativesCount()` | Gets the number of true negatives. |
| `double` | `getTrueNegativesFraction()` | Gets the fraction of true negatives. |
| `double` | `getTruePositivesCount()` | Gets the number of true positives. |
| `double` | `getTruePositivesFraction()` | Gets the fraction of true positives. |

Methods inherited from interface gov.sandia.cognition.learning.performance.categorization.ConfusionMatrix
`add, add, addAll, clear, getAccuracy, getActualCategories, getActualCount, getAverageCategoryAccuracy, getAverageCategoryErrorRate, getCategories, getCategoryAccuracy, getCategoryErrorRate, getCount, getErrorRate, getPredictedCategories, getPredictedCategories, getPredictedCount, getTotalCorrectCount, getTotalCount, getTotalIncorrectCount, isEmpty`

Methods inherited from interface gov.sandia.cognition.util.CloneableSerializable
`clone`

Method Detail

### getTruePositivesCount

`double getTruePositivesCount()`
Gets the number of true positives. This is the (true, true) entry.

Returns:
The number of true positives.

### getFalsePositivesCount

`double getFalsePositivesCount()`
Gets the number of false positives. This is the (false, true) entry.

Returns:
The number of false positives.

### getTrueNegativesCount

`double getTrueNegativesCount()`
Gets the number of true negatives. This is the (false, false) entry.

Returns:
The number of true negatives.

### getFalseNegativesCount

`double getFalseNegativesCount()`
Gets the number of false negatives. This is the (true, false) entry.

Returns:
The number of false negatives.

### getTruePositivesFraction

`double getTruePositivesFraction()`
Gets the fraction of true positives. This is the (true, true) fraction.

Returns:
The fraction of true positives.

### getFalsePositivesFraction

`double getFalsePositivesFraction()`
Gets the fraction of false positives. This is the (false, true) fraction.

Returns:
The fraction of false positives.

### getTrueNegativesFraction

`double getTrueNegativesFraction()`
Gets the fraction of true negatives. This is the (false, false) fraction.

Returns:
The fraction of true negatives.

### getFalseNegativesFraction

`double getFalseNegativesFraction()`
Gets the fraction of false negatives. This is the (true, false) fraction.

Returns:
The fraction of false negatives.
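The four fraction getters above are the corresponding counts normalized by the total count. A minimal sketch of that relationship (the `fraction` helper is hypothetical, not part of the interface):

```java
public class BinaryFractionsSketch {
    // Hypothetical helper illustrating the count-to-fraction relationship:
    // each get*Fraction() value is the matching get*Count() over getTotalCount().
    public static double fraction(double entryCount, double totalCount) {
        return entryCount / totalCount;
    }

    public static void main(String[] args) {
        // Example counts: tp = 40, fp = 10, tn = 45, fn = 5, total = 100.
        double tp = 40, fp = 10, tn = 45, fn = 5;
        double total = tp + fp + tn + fn;
        // The four fractions sum to 1.0 (up to floating-point rounding).
        System.out.println(fraction(tp, total) + fraction(fp, total)
                + fraction(tn, total) + fraction(fn, total));
    }
}
```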

### getSensitivity

`double getSensitivity()`
The sensitivity value for the confusion matrix. The sensitivity is the number of true positives divided by the number of true positives plus the number of false negatives: TP / (TP + FN). It is equivalent to recall.

Returns:
The sensitivity, which is between 0.0 and 1.0.

### getSpecificity

`double getSpecificity()`
The specificity value for the confusion matrix. The specificity is the number of true negatives divided by the number of true negatives plus the number of false positives: TN / (TN + FP).

Returns:
The specificity value, which is between 0.0 and 1.0.
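The two formulas above can be written directly from the counts. A hedged sketch (standalone helpers for illustration, not the interface's implementation):

```java
public class SensitivitySpecificitySketch {
    // Sensitivity (equivalent to recall): TP / (TP + FN).
    public static double sensitivity(double truePositives, double falseNegatives) {
        return truePositives / (truePositives + falseNegatives);
    }

    // Specificity: TN / (TN + FP).
    public static double specificity(double trueNegatives, double falsePositives) {
        return trueNegatives / (trueNegatives + falsePositives);
    }
}
```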

### getPrecision

`double getPrecision()`
The precision value for the confusion matrix. The precision is the number of true positives divided by the number of true positives plus the number of false positives: TP / (TP + FP).

Returns:
The precision value, which is between 0.0 and 1.0.

### getRecall

`double getRecall()`
The recall value for the confusion matrix. The recall is the number of true positives divided by the number of true positives plus the number of false negatives: TP / (TP + FN). It is equivalent to sensitivity.

Returns:
The recall value, which is between 0.0 and 1.0.
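Precision and recall follow the same pattern. A minimal sketch of the two formulas (illustrative helpers, not the interface's implementation):

```java
public class PrecisionRecallSketch {
    // Precision: TP / (TP + FP) -- of all predicted positives, how many were correct.
    public static double precision(double truePositives, double falsePositives) {
        return truePositives / (truePositives + falsePositives);
    }

    // Recall (equivalent to sensitivity): TP / (TP + FN) -- of all actual
    // positives, how many were found.
    public static double recall(double truePositives, double falseNegatives) {
        return truePositives / (truePositives + falseNegatives);
    }
}
```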

### getFScore

`double getFScore()`
The F-score of the confusion matrix, which is also known as the F1-score or F-measure. It is calculated as: 2 * (precision * recall) / (precision + recall). It is equivalent to the F-score with beta = 1.

Returns:
The F-score, which is between 0.0 and 1.0.

### getFScore

`double getFScore(double beta)`
The F-score for the confusion matrix with the given trade-off parameter (beta). It is calculated as: (1 + beta^2) * (precision * recall) / ((beta^2 * precision) + recall)

Parameters:
`beta` - The beta value of the score. It controls the relative importance of recall compared to precision: values greater than 1 weight recall more heavily, and values less than 1 weight precision more heavily.
Returns:
The F-score for the matrix, which is between 0.0 and 1.0.
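The general F-beta formula reduces to the F1-score when beta = 1. A hedged sketch of both (illustrative helpers, not the interface's implementation):

```java
public class FScoreSketch {
    // General F-beta: (1 + beta^2) * (precision * recall) / ((beta^2 * precision) + recall).
    public static double fScore(double precision, double recall, double beta) {
        double betaSquared = beta * beta;
        return (1.0 + betaSquared) * (precision * recall)
                / ((betaSquared * precision) + recall);
    }

    // F1-score: the beta = 1 special case, i.e. the harmonic mean
    // 2 * (precision * recall) / (precision + recall).
    public static double fScore(double precision, double recall) {
        return fScore(precision, recall, 1.0);
    }
}
```

With precision = 0.5 and recall = 1.0, for example, beta = 2 produces a higher score than beta = 1 because it rewards the strong recall more.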