A confusion matrix is a performance measurement tool for machine learning classification problems. It is a table that compares actual classes against predicted classes, showing exactly where the model's predictions agree with and diverge from reality.
## Key Components
- True Positives (TP): The number of actual positive instances that are correctly predicted.
- False Positives (FP): The number of actual negative instances incorrectly labeled as positive.
- True Negatives (TN): The number of actual negative instances correctly predicted.
- False Negatives (FN): The number of actual positive instances incorrectly labeled as negative.
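The four counts above can be computed directly from a pair of label lists. The sketch below uses made-up binary labels (1 = positive, 0 = negative) purely for illustration:

```python
# Toy data: actual labels vs. a classifier's predictions (illustrative only).
actual    = [1, 1, 1, 0, 0, 0, 1, 0]
predicted = [1, 0, 1, 0, 1, 0, 1, 0]

# Count each confusion-matrix cell by comparing pairs of labels.
tp = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 1)
fp = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 1)
tn = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 0)
fn = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 0)

print(f"TP={tp}, FP={fp}, TN={tn}, FN={fn}")
```

For this toy data the counts are TP=3, FP=1, TN=3, FN=1, and the four cells always sum to the total number of instances.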
## Example
Here's a simple confusion matrix:
| | Predicted Positive | Predicted Negative |
|---|---|---|
| Actual Positive | TP | FN |
| Actual Negative | FP | TN |
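The table layout (rows = actual class, columns = predicted class) can be built as a small function. This is a minimal sketch, not a library API; the helper name `confusion_matrix` and the 1/0 label convention are assumptions for illustration:

```python
def confusion_matrix(actual, predicted):
    """Return [[TP, FN], [FP, TN]] for binary labels (1 = positive, 0 = negative).

    Rows correspond to the actual class, columns to the predicted class,
    matching the table layout above.
    """
    tp = fn = fp = tn = 0
    for a, p in zip(actual, predicted):
        if a == 1:
            if p == 1:
                tp += 1  # actual positive, predicted positive
            else:
                fn += 1  # actual positive, predicted negative
        else:
            if p == 1:
                fp += 1  # actual negative, predicted positive
            else:
                tn += 1  # actual negative, predicted negative
    return [[tp, fn], [fp, tn]]

# Illustrative toy data only.
matrix = confusion_matrix([1, 1, 1, 0, 0, 0, 1, 0],
                          [1, 0, 1, 0, 1, 0, 1, 0])
print(matrix)  # [[3, 1], [1, 3]]
```

Keeping rows as actual and columns as predicted mirrors the table above; note that some libraries transpose this layout, so always check the orientation before reading off cells.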
## Visual Representation
For a more detailed understanding, you can refer to this Confusion Matrix Visualization.
[Image: Confusion Matrix Example]
## More Information
To learn more about AI and machine learning, visit our AI Tutorial.