Available with Data Reviewer license.
Reviewer rules enable you to detect features that do not comply with the data quality requirements established by your organization. This includes the detection of errors that affect a feature's attribution, geometric integrity, or relationships with other features. These rules can be used to assess a feature's quality during different phases of the data production workflow.
Automated review capabilities
The first step when designing a Reviewer rule is to understand the capabilities of automated checks in Data Reviewer. The following are examples of check types used to assess different aspects of a feature's quality.
- Spatial relationship checks—Analyze the spatial relationships between features. You can identify whether features overlap, touch, intersect, or fall within a specified distance of one another. For example, you may want to verify that a road does not cross into the ocean or that a fire hydrant is connected to a water lateral.
- Attribute checks—Analyze the attribute values of features and tables. This can include simple field value validations, such as those constrained by a geodatabase domain, or more complex attribute dependencies. For many features, one attribute value depends on another attribute of the same feature. For example, if a road is under construction, it may not be accessible. An attribute check can be configured to monitor the status and accessibility of roads; a conceptual sketch of this dependency follows this list.
- Feature integrity checks—Analyze the properties of features. Not all features in a database follow the same capture criteria. For example, you may have collection rules that define how close two vertices can be or whether multipart features are allowed in your data. Feature integrity checks ensure that collection rules are followed for each data source.
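The attribute dependency described above can be illustrated with a short, conceptual Python sketch. The field names (STATUS, ACCESSIBLE, ROAD_ID) and sample records are hypothetical stand-ins for attributes in your own data model; in Data Reviewer this kind of rule is configured through the check itself rather than written as code.

```python
# Conceptual illustration of an attribute dependency check:
# a road that is under construction must not be marked as accessible.
# Field names and records are hypothetical examples, not a Data Reviewer API.

def check_road_accessibility(road):
    """Return an error message if the status/accessibility dependency is violated."""
    if road["STATUS"] == "Under Construction" and road["ACCESSIBLE"] != "No":
        return f"Road {road['ROAD_ID']}: under construction but marked as accessible"
    return None

roads = [
    {"ROAD_ID": 101, "STATUS": "Open", "ACCESSIBLE": "Yes"},
    {"ROAD_ID": 102, "STATUS": "Under Construction", "ACCESSIBLE": "Yes"},  # violates the rule
]

errors = [msg for road in roads if (msg := check_road_accessibility(road))]
print(errors)
```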
Identification of Data Reviewer checks
The identification of automated Reviewer checks is often a task performed by a subject matter expert with deep knowledge of the data model and quality requirements. A subject matter expert can quickly identify a variety of issues related to feature integrity, attribute completeness, and spatial relationships.
Using a requirements traceability matrix, the subject matter expert can associate Data Reviewer checks with data quality requirements.
The following table is an example of a populated requirements traceability matrix that references some of the automated review capabilities described above.
This matrix can provide organizations with a quick reference for identifying a specific capability in a product and its use when collecting requirements.
ID | Requirement | Requirement number | Requirement category | Software | Product capability | Data Reviewer check |
---|---|---|---|---|---|---|
5 | Ability to ensure that source data will be migrated into the production database and have appropriate domains and relationships | D002 | Data Requirement—Logical consistency | Data Reviewer | Domain check, Relationship check, Subtype check | Yes |
7 | Ability to ensure that production data is for mobile collectors and is attribute accurate | D004 | Data Requirement—Thematic accuracy | Data Reviewer | Regular Expression check, Table to Table Attribute check | Yes |
8 | Ability to ensure that there is no overlap between event measures during the project period of 2010–2020 | D005 | Data Requirement—Temporal quality | Data Reviewer | Invalid Events check | Yes |
10 | Ability to identify the number of cells that are not populated (NULL) for each required attribute field | D006 | Data Requirement—Thematic accuracy | Data Reviewer | Query Attributes check | Yes |
11 | Ability to identify parcels that have no overlaying building footprint features | D007 | Data Requirement—Logical consistency | Data Reviewer | Feature on Feature check | Yes |
13 | Ability to validate a unique ID attribute linking a parcel to matching building footprint features | D008 | Data Requirement—Logical consistency | Data Reviewer | Feature on Feature check | Yes |
Automated review in workflows
Reviewer rules enable you to assess quality across multiple phases in the data life cycle. This includes assessment of a feature during initial creation, maintenance, updates, sharing, and archiving or deletion. Further, data currently in use within the organization can be assessed to identify compliance with changing data quality requirements that emerge in response to new business processes.
Learn more about methods to implement automated review
Detect errors in existing data
Automated checks can be used to assess the overall quality of your data based on your organization's unique quality requirements. This can include the review of all features in a dataset to establish a baseline understanding of the data's fitness for use, as well as the review of a subset of features as a step in a workflow. Errors detected using this form of review are stored in your geodatabase to support corrective workflows and quality reporting.
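As a rough analogy for this batch style of review, the sketch below runs a set of check functions over every record in a dataset and collects the violations as error records that could then be written to a table for correction and reporting. The check function, field names, and ErrorRecord structure are hypothetical illustrations, not part of the Data Reviewer API.

```python
# Conceptual batch review: evaluate every feature against a set of checks
# and collect error records for corrective workflows and quality reporting.
# The check, fields, and ErrorRecord structure are hypothetical stand-ins
# for configured Reviewer rules and the error results stored in a geodatabase.

from dataclasses import dataclass

@dataclass
class ErrorRecord:
    feature_id: int
    check_name: str
    message: str

def missing_name_check(feature):
    """Flag features whose required NAME field is not populated."""
    if not feature.get("NAME"):
        return "Required field NAME is not populated"
    return None

CHECKS = {"Query Attributes (NULL NAME)": missing_name_check}

def review_dataset(features):
    errors = []
    for feature in features:
        for check_name, check in CHECKS.items():
            message = check(feature)
            if message:
                errors.append(ErrorRecord(feature["OBJECTID"], check_name, message))
    return errors

parcels = [{"OBJECTID": 1, "NAME": "Elm Street"}, {"OBJECTID": 2, "NAME": ""}]
for error in review_dataset(parcels):
    print(error)
```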
Prevent errors in editing workflows
Automated checks can also be implemented to assess quality when creating or modifying features in your geodatabase. This form of validation serves to enforce data integrity like other forms of geodatabase constraints, such as domains and subtypes. Edits that result in data that does not comply with the organization's data quality requirements are rejected and will not be saved.
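The edit-time pattern can be sketched in plain Python as well: a proposed edit is validated before it is applied, and a non-compliant edit is rejected rather than saved. The rule, the apply_edit function, and the in-memory data store are hypothetical illustrations of the concept; in a geodatabase this enforcement comes from the configured rules and constraints themselves, not from user code.

```python
# Conceptual edit-time validation: reject an edit that violates a rule
# instead of saving it. The rule, apply_edit, and the in-memory datastore
# are hypothetical illustrations, not a Data Reviewer or geodatabase API.

class EditRejectedError(Exception):
    pass

def validate_hydrant(feature):
    # Hypothetical rule: every hydrant must reference the water lateral it connects to.
    if not feature.get("LATERAL_ID"):
        raise EditRejectedError("Hydrant must reference a connecting water lateral")

def apply_edit(datastore, feature):
    validate_hydrant(feature)            # raises before anything is persisted
    datastore[feature["HYDRANT_ID"]] = feature

hydrants = {}
apply_edit(hydrants, {"HYDRANT_ID": 1, "LATERAL_ID": 77})        # accepted and saved
try:
    apply_edit(hydrants, {"HYDRANT_ID": 2, "LATERAL_ID": None})  # rejected, not saved
except EditRejectedError as exc:
    print("Edit rejected:", exc)
```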