Available with Standard or Advanced license.
Available for an ArcGIS organization with the ArcGIS Reality license.
The data requirements and workflow below explain how to set up a Reality mapping workspace using satellite imagery.
General data requirements
The following are general data requirements for images to be processed using ArcGIS Reality for ArcGIS Pro:
- Two or more images—The minimum requirement is two highly overlapping single images (not collected as a stereo pair), one stereo pair, or a tri-stereo pair. For increased accuracy, quality, and redundancy, using more images is recommended.
- Cloud free—The images to be processed are free of clouds in the project area.
- Highly overlapping images—The project area must be fully covered with overlapped imagery. For multisensor datasets, each sensor data type must fully cover the project area.
- Not orthorectified—ArcGIS Reality for ArcGIS Pro does not work with orthorectified images. The following are examples of vendor-specific imagery products suitable for orthorectification using ArcGIS Reality for ArcGIS Pro:
- Maxar—View Ready (Standard) OR2A mono or View Ready Stereo (Standard) OR2A product
- Airbus—Primary mono, Primary stereo, or Primary tri-stereo product
- Associated Rational Polynomial Coefficient (RPC) file—The RPC file is a text file delivered with the satellite imagery by the provider. The RPC file is required to support the geometric correction of satellite images.
Note:
Depending on the satellite imagery vendor, the file may be called an RPC or RPB file, and may sometimes have the .txt extension. They all refer to the same type of information: an abstraction of the satellite camera model.
- Single-band or multiband data—Supported data includes single-band (panchromatic), 3-band (RGB), multiband pansharpened, or 4-band (RGB, NIR) multispectral imagery; multispectral plus panchromatic data is also supported, to enable pansharpening in ArcGIS Reality for ArcGIS Pro.
- 8-bit or 16-bit imagery.
- High sun elevation angle—This minimizes the effects of shadow in the derived products. A sun elevation angle of 60 degrees or higher is recommended.
- Spatial referencing—Supported options include geographic (WGS84), WGS84 UTM, or NAD83 UTM.
- Elevation source—This information provides an initial height reference for computing the block adjustment. This height reference can be derived from a digital elevation model (DEM) or the image metadata, or you can specify an average ground elevation or z-value.
Note:
If you need to perform an RPC adjustment using a DEM as the elevation source, it is recommended that you use a local DEM that has an EGM96 or WGS84 vertical coordinate system (VCS). Use the Project Raster tool to re-project the DEM if it has a VCS that is not EGM96 or WGS84.
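As an illustrative sketch (not part of the ArcGIS workflow itself), the decision of whether a local DEM needs reprojection can be expressed as a check on its vertical coordinate system name. The VCS name strings below are assumptions; verify them against what your DEM's spatial reference actually reports before relying on this logic.

```python
# Sketch: decide whether a DEM needs reprojection before an RPC adjustment.
# The VCS name spellings below are illustrative assumptions; check the exact
# names reported by your DEM's spatial reference properties.

SUPPORTED_VCS = {"EGM96", "WGS84", "WGS_1984"}

def needs_vcs_reprojection(vcs_name: str) -> bool:
    """Return True if the DEM's vertical coordinate system is neither
    EGM96 nor WGS84, meaning the Project Raster tool should be run first."""
    normalized = vcs_name.strip().upper().replace(" ", "_")
    return not any(supported in normalized for supported in SUPPORTED_VCS)
```

If this check returns True, reproject the DEM with the Project Raster tool before using it as the elevation source.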
- Optionally, download the Global DTM of the Earth locally, which provides access to a global DEM that can be used to support this process. Once installed, this global DEM will replace the ArcGIS World Elevation service as the default DEM when processing satellite imagery using a Reality mapping workspace.
Multisensor data requirements
Additional requirements for working with multisensor satellite imagery are listed below.
- Images to be processed must be from the same family of satellite sensors. For example, Maxar's WorldView-3, WorldView-2, and GeoEye-1 can be combined in a single project.
- Images must have the same spatial referencing.
- Images must have the same number of bands.
- Images must have the same bit depth.
Mesh generation data requirements
In addition to the above requirements, the following requirements are recommended for mesh generation:
- Sensor angles—It is recommended that the acquired images have varying incidence angles, including images with incidence angles close to nadir (0-5 degrees) and oblique images (up to 20-28 degrees). This ensures good coverage of building facades. For optimal results, the convergence angle (the 3D angle between the two look directions) of the stereo pairs should be close to 9 degrees.
- Target azimuth angle—A wide target azimuth range covering all look directions around the project area is required for good mesh texturing. See the target azimuth example below.
Note:
Target azimuth angle refers to the angle in degrees from the target to the sensor. The angle range is from 0 degrees to 360 degrees in a clockwise direction.
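For illustration, the target azimuth and the convergence angle described above can be computed from the acquisition geometry. The following sketch is not an ArcGIS API; it assumes a planar approximation for the azimuth over a small project area, and an incidence angle measured from the local vertical at the target.

```python
import math

def target_azimuth(t_lonlat, s_lonlat):
    """Azimuth in degrees from the target to the sensor ground position,
    measured clockwise from north (0-360). Planar small-area approximation."""
    d_east = s_lonlat[0] - t_lonlat[0]
    d_north = s_lonlat[1] - t_lonlat[1]
    return math.degrees(math.atan2(d_east, d_north)) % 360.0

def look_vector(azimuth_deg, incidence_deg):
    """Unit vector from the target toward the sensor, given the target
    azimuth and the incidence angle measured from the local vertical."""
    az, inc = math.radians(azimuth_deg), math.radians(incidence_deg)
    return (math.sin(az) * math.sin(inc),   # east
            math.cos(az) * math.sin(inc),   # north
            math.cos(inc))                  # up

def convergence_angle(az1, inc1, az2, inc2):
    """3D angle in degrees between the two look vectors of a stereo pair."""
    v1, v2 = look_vector(az1, inc1), look_vector(az2, inc2)
    dot = sum(a * b for a, b in zip(v1, v2))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot))))
```

For example, two images acquired at 20-degree incidence from opposite azimuths (0 and 180 degrees) have a convergence angle of 40 degrees, well above the recommended value, while same-side acquisitions converge much less.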
Create a Reality mapping workspace
To create a Reality mapping workspace using satellite imagery, complete the following steps:
- On the Imagery tab, click New Workspace.
- On the Workspace Configuration page, provide a name for the workspace.
- Ensure that Workspace Type is set to Reality Mapping.
- From the Sensor Data Type drop-down list, choose Satellite.
- Optionally, from the Basemap drop-down list, choose a basemap as a backdrop for the image collection.
- Optionally, set the Parallel Processing Factor value of the workspace. The default value of 50% means that half of the total CPU cores will be used to support Reality mapping processing.
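As a rough illustration of how the percentage maps to cores (the exact rounding the software applies may differ from this sketch):

```python
import os

def cores_used(parallel_processing_factor_pct, total_cores=None):
    """Approximate number of CPU cores used for a given Parallel Processing
    Factor percentage. Illustrative only; ArcGIS may round differently."""
    total = total_cores if total_cores is not None else os.cpu_count()
    return max(1, int(total * parallel_processing_factor_pct / 100))
```

On an 8-core machine, the default of 50% corresponds to roughly 4 cores.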
- Optionally, check the Track adjustment restore points check box to be able to revert your workspace to a previous state.
- Optionally, check the Import and use existing image collection check box to import and use an existing mosaic dataset. See Create a Reality mapping workspace from a mosaic dataset for more information.
- Click Next.
- On the Image Collection page, from the Sensor Type drop-down menu, choose an appropriate sensor type from the list of satellite sensors.
- For Folder Containing Images, click the Browse button and browse to the folder on disk containing the imagery, select it, and click OK.
The supported Workspace Spatial Reference information is automatically populated, and the option to add the spatial reference information manually is unavailable. If the system cannot automatically determine the appropriate spatial referencing, manual entry of this information is enabled.
- If the appropriate spatial referencing was not set automatically, for Spatial Reference, click the Browse button and choose a map reference system and vertical coordinate system, and set the Current XY and Current Z coordinates for the project.
When processing satellite imagery in Reality mapping, the planimetric (x,y) coordinate system must be defined using the WGS84 UTM reference frame, and the vertical coordinate system must be WGS84.
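The WGS84 UTM zone, and its EPSG code, for a project area can be derived from the area's longitude and latitude. This helper is a hypothetical convenience for finding the right zone, not part of ArcGIS:

```python
def wgs84_utm_epsg(lon, lat):
    """EPSG code of the WGS84 UTM zone containing (lon, lat).
    Northern-hemisphere zones are EPSG 32601-32660, southern 32701-32760."""
    zone = int((lon + 180.0) // 6) + 1
    zone = min(zone, 60)  # lon == 180 falls back into zone 60
    return (32600 if lat >= 0 else 32700) + zone
```

For example, a project area near longitude -122.4, latitude 37.8 falls in UTM zone 10 North, EPSG 32610.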
- Click OK to close the Spatial Reference window.
- Click Next.
- If you're working with multisensor satellite imagery, click the Add Sensor button to add more sensor types.
- Repeat steps 10 through 12 to define the additional sensor parameters.
- Click Next.
The Data Loader Options page appears.
- On the Data Loader Options page, specify an Elevation Source value, or use the default DEM.
- If you use a DEM as the Elevation Source value, set the Geoid Correction value.
If the local DEM has ellipsoidal height, select None from the Geoid Correction drop-down list. If the DEM has orthometric height, select EGM96.
- For Processing Template, choose the appropriate processing template based on your project requirements.
To generate a digital surface model (DSM) or digital terrain model (DTM), choose the Panchromatic template. To generate a true ortho or DSM mesh, choose one of the following templates:
- Multispectral—Use if the data being processed is already pansharpened, or if the imagery consists of multispectral and panchromatic data but a multispectral product is required.
- Panchromatic—Use when processing panchromatic data.
- Pansharpened—Use if the data being processed consists of multispectral imagery with associated panchromatic data.
Note:
Pansharpening is an image fusion process that combines a high-resolution panchromatic image with a lower-resolution multispectral image to create a high-resolution multispectral image.
- All Bands—Use when adding reflectance satellite imagery data for DSM, true ortho, or mesh generation.
- Multispectral Acomp—Use if the data being processed consists of multispectral reflectance data that will not be used for DSM, true ortho, or mesh creation.
- Panchromatic Acomp—Use when processing panchromatic reflectance data that will not be used to support Reality mapping-derived product generation.
- Pansharpened Acomp—Use if the data being processed consists of pansharpened reflectance data that will not be used for DSM, true ortho, or mesh creation.
Note:
The workspace creation process will fail if an incompatible processing template is used to add the imagery. It is recommended that you review the image metadata file to determine the imagery product type, which informs the choice of processing template.
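To make the pansharpening concept concrete, the following sketch applies a Brovey-style transform to a single pixel. Brovey is one of several common fusion methods; it is shown here only as an illustration, and the actual method applied by the Pansharpened templates may differ.

```python
def brovey_pansharpen(r, g, b, pan):
    """Brovey-style pansharpening for a single pixel: each multispectral
    band is scaled by the ratio of the panchromatic value to the mean of
    the multispectral bands. Illustrative sketch of one common method."""
    mean_ms = (r + g + b) / 3.0
    if mean_ms == 0:
        return (0.0, 0.0, 0.0)
    ratio = pan / mean_ms
    return (r * ratio, g * ratio, b * ratio)
```

The spatial detail comes from the high-resolution panchromatic value, while the band ratios preserve the relative color of the multispectral pixel.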
- Expand the Advanced Options section.
The available options vary based on the selected processing template. For example, if an atmospheric compensation (Acomp)-type template is selected, the Stretch and Gamma options are unavailable; they are available for all other Processing Template types.
- Expand Gamma.
- For Gamma Stretch, select User Defined from the drop-down menu, and provide an appropriate value.
For example, 1.7 works well with Maxar and Airbus imagery.
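A gamma stretch in its common form raises the normalized pixel value to the power 1/gamma, which brightens midtones when gamma is greater than 1. A minimal sketch, assuming 8-bit data and this standard formula (the software's exact implementation may differ in detail):

```python
def gamma_stretch(dn, gamma=1.7, max_dn=255):
    """Apply a gamma stretch to a pixel value: normalize to 0-1, raise to
    the power 1/gamma, and rescale. With gamma > 1, midtones brighten;
    1.7 is the value suggested above for Maxar and Airbus imagery."""
    return max_dn * (dn / max_dn) ** (1.0 / gamma)
```

Black and white points are unchanged, while a mid-gray input of 64 maps to roughly 113, lifting shadow detail.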
- Expand Pre-processing.
- Ensure that the Calculate Statistics check box is checked.
Note:
If statistics were previously calculated for the imagery using the Build Pyramids and Statistics geoprocessing tool, this step can be skipped.
- For Number of Columns to Skip and Number of Rows to Skip, ensure that the value is 1.
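Conceptually, the statistics and skip-factor settings work together as in the following sketch, which samples a band with the given skip factors and computes the basic statistics. This is an illustration of the idea, not the software's implementation:

```python
import statistics

def band_statistics(band, skip_rows=1, skip_cols=1):
    """Compute per-band statistics (min, max, mean, stddev) from a 2D list
    of pixel values, sampling every skip_rows-th row and skip_cols-th
    column. A skip factor of 1 samples every pixel, as recommended above."""
    sample = [v for row in band[::skip_rows] for v in row[::skip_cols]]
    return {
        "min": min(sample),
        "max": max(sample),
        "mean": statistics.fmean(sample),
        "stddev": statistics.pstdev(sample),
    }
```

Larger skip factors compute statistics faster from fewer pixels, at the cost of accuracy; keeping both skip values at 1 uses every pixel.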
- Accept all other default values, and click Finish to create the workspace.
Once the Reality mapping workspace is created, the image collection is loaded in the workspace and displayed on the map. You can now perform block adjustments and generate Reality mapping products.
Related topics
- Reality mapping in ArcGIS Pro
- Add ground control points to a Reality mapping workspace
- Manage tie points in a Reality mapping workspace
- Perform a Reality mapping block adjustment
- Generate multiple products using ArcGIS Reality for ArcGIS Pro
- Introduction to the ArcGIS Reality for ArcGIS Pro extension
- Frequently asked questions