Compute Confusion Matrix (Image Analyst)

Available with Spatial Analyst license.

Available with Image Analyst license.

Summary

Computes a confusion matrix with errors of omission and commission, and derives a kappa index of agreement, Intersection over Union (IoU), and overall accuracy between the classified map and the reference data.

This tool uses the outputs from the Create Accuracy Assessment Points tool or the Update Accuracy Assessment Points tool.

Usage

  • This tool computes a confusion matrix using the random accuracy assessment points. The accuracy assessment points are generated by the Create Accuracy Assessment Points tool and updated by the Update Accuracy Assessment Points tool. These two tools ensure that each point has valid class values in the Classified and GrndTruth fields, both of which are long integer field types. The tool calculates the user's accuracy and producer's accuracy for each class, as well as an overall kappa index of agreement. These accuracy rates range from 0 to 1, where 1 represents 100 percent accuracy. The following is an example of a confusion matrix:

                  c_1      c_2      c_3    Total   U_Accuracy    Kappa
    c_1            49        4        4       57       0.8596        0
    c_2             2       40        2       44       0.9091        0
    c_3             3        3       59       65       0.9077        0
    Total          54       47       65      166            0        0
    P_Accuracy     0.9074   0.8511   0.9077    0       0.8916        0
    Kappa           0        0        0        0            0   0.8357

    Confusion matrix example

  • User's accuracy shows false positives in which pixels are incorrectly classified as a known class when they should have been classified as something else. An example is when the classified image identifies a pixel as impervious, but the reference identifies it as forest. The impervious class has extra pixels that it should not have according to the reference data.

    User's accuracy is also referred to as errors of commission, or type 1 error. The data to compute this error rate is read from the rows of the table.

    The Total row shows the number of points that should have been identified as a given class according to the reference data.

  • Producer's accuracy measures false negatives, in which pixels of a known class are classified as something other than that class. An example is when the classified image identifies a pixel as forest, but it should be impervious. In this case, the impervious class is missing pixels according to the reference data.

    Producer's accuracy is also referred to as errors of omission, or type 2 error. The data to compute this error rate is read from the columns of the table.

    The Total column shows the number of points that were identified as a given class according to the classified map.

  • Kappa index of agreement gives an overall assessment of the accuracy of the classification.

  • Intersection over Union (IoU) is the area of overlap between the predicted segmentation and the ground truth divided by the area of union between the predicted segmentation and the ground truth. An IoU value is computed for each class, and the mean IoU is the average of the per-class values.
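To make the arithmetic behind these measures concrete, the following is a minimal pure-Python sketch (independent of ArcGIS; the variable names are illustrative, not part of the tool) that derives user's accuracy, producer's accuracy, overall accuracy, kappa, and per-class IoU from the example matrix shown above:

```python
# Confusion matrix from the example above: rows = classified, columns = reference.
matrix = [
    [49, 4, 4],   # c_1
    [2, 40, 2],   # c_2
    [3, 3, 59],   # c_3
]

n = sum(sum(row) for row in matrix)                # 166 assessment points in total
row_totals = [sum(row) for row in matrix]          # classified totals (Total column)
col_totals = [sum(col) for col in zip(*matrix)]    # reference totals (Total row)
diag = [matrix[i][i] for i in range(len(matrix))]  # correctly classified points

users_accuracy = [d / r for d, r in zip(diag, row_totals)]      # 1 - commission error
producers_accuracy = [d / c for d, c in zip(diag, col_totals)]  # 1 - omission error
overall_accuracy = sum(diag) / n

# Kappa: observed agreement corrected for the agreement expected by chance.
chance_agreement = sum(r * c for r, c in zip(row_totals, col_totals)) / (n * n)
kappa = (overall_accuracy - chance_agreement) / (1 - chance_agreement)

# Per-class IoU: overlap (diagonal) over union (row total + column total - overlap).
iou = [d / (r + c - d) for d, r, c in zip(diag, row_totals, col_totals)]
mean_iou = sum(iou) / len(iou)

print(round(overall_accuracy, 4))  # 0.8916
print(round(kappa, 4))             # 0.8357
```

Rounding these results to four decimal places reproduces the U_Accuracy, P_Accuracy, overall accuracy, and Kappa values shown in the example table.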

Parameters

Label | Explanation | Data Type
Input Accuracy Assessment Points

The accuracy assessment point feature class created from the Create Accuracy Assessment Points tool, containing the Classified and GrndTruth fields. The Classified and GrndTruth fields are both long integer field types.

Feature Layer
Output Confusion Matrix

The output file name of the confusion matrix in table format.

The format of the table is determined by the output location and path. By default, the output will be a geodatabase table. If the path is not in a geodatabase, specify a .dbf extension to save it in dBASE format.

Table

Syntax

ComputeConfusionMatrix(in_accuracy_assessment_points, out_confusion_matrix)

Name | Explanation | Data Type
in_accuracy_assessment_points

The accuracy assessment point feature class created from the Create Accuracy Assessment Points tool, containing the Classified and GrndTruth fields. The Classified and GrndTruth fields are both long integer field types.

Feature Layer
out_confusion_matrix

The output file name of the confusion matrix in table format.

The format of the table is determined by the output location and path. By default, the output will be a geodatabase table. If the path is not in a geodatabase, specify a .dbf extension to save it in dBASE format.

Table

Code sample

ComputeConfusionMatrix example 1 (stand-alone script)

This example computes the confusion matrix based on accuracy assessment points.

import arcpy
from arcpy.ia import *

# Check out the ArcGIS Image Analyst extension license
arcpy.CheckOutExtension("ImageAnalyst")

# Accuracy assessment points created by the Create Accuracy Assessment Points tool
accuracy_assessment_points = "c:\\test\\aapnt2.shp"
# Output table; the .dbf extension saves it in dBASE format because the
# path is not in a geodatabase
confusion_matrix = "c:\\test\\confm.dbf"

ComputeConfusionMatrix(accuracy_assessment_points, confusion_matrix)

Environments

This tool does not use any geoprocessing environments.

Licensing information

  • Basic: Requires Image Analyst or Spatial Analyst
  • Standard: Requires Image Analyst or Spatial Analyst
  • Advanced: Requires Image Analyst or Spatial Analyst

Related topics