ArcGIS Pro can be used with various GPU-enabled, cloud-hosted virtual machines (VMs), including a range of Microsoft Azure offerings described in this topic.
These Azure virtual machines offer the flexibility to support various use cases, so you can allocate the resources you need, such as vCPU, RAM, and GPU. A VM can start with minimum specifications and later be resized within the increments Azure defines for each series. For example, an NC4as_T4_v3 VM can be moved up to an NC8as_T4_v3, gaining additional vCPU and RAM while retaining the NVIDIA T4 GPU allocated to that VM series.
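The resize can also be scripted with the Az PowerShell module. The following is a minimal sketch, not a production script: the resource group and VM names are placeholders, and the VM restarts during the resize.

```powershell
# Minimal sketch: step an existing VM up one size within its series.
# Assumes the Az module is installed and you are signed in (Connect-AzAccount);
# "gis-rg" and "arcgispro-vm" are placeholder names.
$vm = Get-AzVM -ResourceGroupName "gis-rg" -Name "arcgispro-vm"
$vm.HardwareProfile.VmSize = "Standard_NC8as_T4_v3"   # up from Standard_NC4as_T4_v3
Update-AzVM -ResourceGroupName "gis-rg" -VM $vm        # note: the VM restarts
```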
The guidance below provides GIS and IT teams with a structured framework for selecting the most suitable Azure platform to run ArcGIS Pro, whether you're using stand-alone GPU VMs, pooled desktop sessions, cloud PCs, or on-premises Azure setups. Each option is clearly described with ideal use cases; VM sizing recommendations for light, medium, and heavy workloads; and detailed deployment steps that support choices like NVadsA10, NCasT4_v3, and NVv4 series GPUs for both single-session and multisession scenarios. There is also information about provisioning accelerators like the Mission Landing-Zone Accelerator for governed deployments and Microsoft Windows 365 GPU Cloud PCs for rapid, turnkey provisioning, alongside Azure Stack HCI for latency-sensitive, sovereign environments. By aligning data governance requirements, user concurrency needs, and operational responsibilities, teams can optimize performance, manageability, and cost across all deployment models.
Learn more about Azure Virtual Desktop
Learn more about deploying ArcGIS Pro in Azure Virtual Desktop
Choose an Azure platform
Selecting an Azure environment determines where your ArcGIS Pro projects and associated geodatabases reside, how quickly you can onboard new users or scale capacity, and who is responsible for maintaining GPU drivers.
As the GIS or IT administrator, begin by identifying which data is subject to specific rules or regulations. Then, consider how many people will need access during peak times, such as wildfire season. Finally, decide whether your team or Microsoft will be responsible for updates and maintenance. Once these questions are answered, it's much easier to choose the correct Azure setup.
The table below describes the different Azure platforms, the ideal scenario for each, and key benefits and limitations.
| Platform | Scenario | Benefit | Limitation |
|---|---|---|---|
| GPU-enabled Azure VM | Individual power users need root control or burst-to-cloud workloads. | Full administrative rights, ability to create image snapshots, and per-second billing | One VM per session unless you add Remote Desktop Session Host (RDSH) services |
| Azure Virtual Desktop (AVD) | GPUs are being shared among many analysts via a pooled host pool. | Autoscale policies, FSLogix roaming profiles, and rich metrics | Requires profile tuning and autoscale rules |
| ArcGIS Pro Mission Landing-Zone Accelerator | Deploy a governed AVD quickly via code. | Image builder, NetApp files, and policy and Microsoft Azure Monitor workbooks out of the box | Needs Microsoft Azure Resource Manager (ARM) deployment using Bicep |
| Windows 365 GPU Cloud PC | Contractors or executives who need a turnkey desktop anywhere. | 30-minute provisioning, Microsoft-managed patches, and no virtual network required | Fixed stock-keeping units (SKUs), single-user desktop |
| Azure Local (Stack HCI) | Sovereign or edge sites demanding <5 ms latency with on-premises data. | AVD user experience next to the data when using Microsoft Azure Arc | Hardware capital expenditure, edge feature set in preview |
Additional considerations for platform deployments
When choosing a virtualization platform that uses Azure, there are considerations for licensing, network latency and user experience, system requirements, and real-world scenarios.
Licensing ArcGIS Pro in Azure
ArcGIS Pro supports three license models in Azure-based deployments: Named User, Single Use, and Concurrent Use. Named User licensing is tied to ArcGIS Online or ArcGIS Enterprise credentials and enables access on up to three devices per user. Single Use licenses bind to a specific VM or Cloud PC, removing the need for sign-in. Concurrent Use licensing relies on a shared license pool managed by ArcGIS License Manager running in Azure or on-premises. Importantly, older licensing architectures have struggled with snapshots or VM cloning because of unstable machine identifiers, but such environments are supported more reliably using License Manager 2019.2 and ArcGIS Pro 2.5 or later. However, snapshot restoration or moving VMs may still disrupt license bindings if they are not configured correctly.
Note:
The Concurrent Use license type was deprecated on July 1, 2025. See the Concurrent Use License Type and ArcGIS License Manager deprecation notice for more information.
Network latency and user experience
Performance for ArcGIS Pro in remote desktop setups is especially sensitive to network latency, which often outweighs bandwidth concerns. A responsive user experience typically requires a round-trip time (RTT) of ≤200 ms to support responsive pan and zoom interactions and real-time data access. The recommended network throughput varies by task: approximately 1.5 Mbps for light workflows, approximately 3 Mbps for moderate usage, and 5 Mbps or more for heavy 3D or analytical workloads, scaling further with higher frame rates and resolutions. Deploying compute, storage, and user endpoints within the same Azure region or subnetwork helps optimize latency and responsiveness. Pilot testing using tools like the ArcGIS Pro Performance Assessment Tool is strongly advised to validate real-world performance before full-scale deployment.
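Before a pilot, a rough RTT check from a client machine can flag obvious latency problems. The sketch below assumes PowerShell 7 (where Test-Connection reports a Latency property) and uses a placeholder host name; note that ICMP ping is only a rough proxy for display-protocol latency.

```powershell
# Minimal sketch: compare average ping RTT to the ~200 ms guidance.
$pings = Test-Connection -TargetName "sessionhost.contoso.com" -Count 10
$avgMs = [math]::Round(($pings | Measure-Object -Property Latency -Average).Average, 1)
"Average RTT: $avgMs ms (guideline: 200 ms or less)"
if ($avgMs -gt 200) { Write-Warning "RTT exceeds the recommended threshold for ArcGIS Pro." }
```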
System requirements and real-world performance
ArcGIS Pro requires sufficient local resources to operate effectively, even in cloud environments. The system requirements list 32 GB of RAM as the recommended baseline and 64 GB or more as optimal for heavy analytical workloads or complex rendering. A discrete GPU with at least 4 GB of dedicated memory is also advised. Insufficient CPU, RAM, or GPU allocation, especially in shared or undersized VMs, can lead to lag, delayed tool response, or application instability. Reports from real-world users highlight issues like multiple-second delays when switching layouts or navigating large projects on underpowered VMs. To avoid these pitfalls, it's essential to conduct pilot testing with typical user workflows and iteratively adjust VM sizing to balance cost and performance effectively.
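As a quick sanity check before pilot testing, the sketch below reads the RAM and GPU that a VM actually exposes to the guest and compares them against the baselines above. It assumes a Windows session host; note that the WMI AdapterRAM value is 32-bit and can under-report GPUs with more than 4 GB of memory.

```powershell
# Minimal sketch: compare guest-visible resources to the ArcGIS Pro baseline.
$ramGB  = [math]::Round((Get-CimInstance Win32_ComputerSystem).TotalPhysicalMemory / 1GB, 1)
$gpu    = Get-CimInstance Win32_VideoController | Select-Object -First 1
$vramGB = [math]::Round($gpu.AdapterRAM / 1GB, 1)   # 32-bit value; may under-report
"RAM: $ramGB GB (recommended: 32 GB or more)"
"GPU: $($gpu.Name), reported memory: $vramGB GB (recommended: 4 GB or more)"
if ($ramGB -lt 32) { Write-Warning "RAM is below the recommended ArcGIS Pro baseline." }
```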
GPU-enabled Azure VMs
When a GIS professional needs workstation-class performance without waiting for an entire VDI farm, an Azure GPU VM offers a solution. These VMs can be deployed on demand, giving users the power they need in minutes rather than days. With local administrative rights, you can install beta versions of ArcGIS Pro, test custom Python toolboxes, or configure complex environments for specific workflows—all in an isolated space that won't interfere with colleagues or shared production resources.
You can also capture and reuse template images (golden images) between projects, making it easy to re-create known good VM configurations or environments. Because billing applies only while the VM is running, costs stop when you deallocate it. This makes Azure GPU VMs ideal for short-term mapping sprints, after-hours analysis, and burst workloads of any duration, turning what would traditionally be a capital expense into operational agility and pay-as-you-go efficiency.
Recommended SKUs for Azure VMs are the following:
- NVadsA10 v5—NVIDIA A10 slices
- NCasT4_v3—Up to 4 × T4 GPUs
- NVv4—Fractional AMD MI25 GPUs
For information on sizing, see Virtual machine sizing and user profiles.
To deploy a GPU-enabled Azure VM, complete the following steps; a scripted sketch follows the list:
- Create a VM in a region with capacity for the GPU size you need. Choose an ArcGIS Pro 3.x image.
- Choose a SKU such as NVadsA10, NCasT4, or NVv4 and configure Premium SSD (Solid-State Drive) v2.
- Enable Accelerated Networking and add the NVIDIA GPU Driver Extension.
- Using Remote Desktop, license ArcGIS Pro and run the ArcGIS Pro Performance Assessment Tool (PAT).
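These steps can also be scripted with the Az PowerShell module. The sketch below is a minimal illustration, not a production template: resource names, the image URN (a plain Windows 11 image stands in for your ArcGIS Pro 3.x image), and the extension version are assumptions to verify against your subscription, and the network is created with New-AzVM defaults (enable Accelerated Networking on the NIC with New-AzNetworkInterface -EnableAcceleratedNetworking if you build the network yourself).

```powershell
# Minimal sketch: create a T4 GPU VM and install the NVIDIA driver extension.
$rg = "gis-rg"; $loc = "eastus2"; $name = "arcgispro-vm"   # placeholder names
New-AzResourceGroup -Name $rg -Location $loc
New-AzVM -ResourceGroupName $rg -Name $name -Location $loc `
    -Size "Standard_NC4as_T4_v3" -Credential (Get-Credential) `
    -Image "MicrosoftWindowsDesktop:windows-11:win11-23h2-pro:latest"
# NVIDIA GPU Driver Extension; check the current TypeHandlerVersion for your region.
Set-AzVMExtension -ResourceGroupName $rg -VMName $name -Name "NvidiaGpuDriverWindows" `
    -Publisher "Microsoft.HpcCompute" -ExtensionType "NvidiaGpuDriverWindows" `
    -TypeHandlerVersion "1.6"
```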
Azure Virtual Desktop (AVD) and AVD Accelerator
AVD is built for organizations that want workstation-level performance with more flexibility than a one-GPU-per-user setup. Instead of dedicating a GPU to each user, AVD allows you to share GPU resources across multiple users by brokering GPU-enabled VMs through Microsoft's control plane. This helps reduce costs while delivering smooth performance for demanding applications like ArcGIS Pro.
You can monitor key metrics such as frame rate, encoder latency, and user-level bandwidth through Azure Monitor to ensure a quality user experience. The environment is scalable: new analysts can be added by assigning them to an Entra ID group, and virtual machines can automatically shut down when not in use. The result is a flexible, efficient cloud environment that balances performance and scalability.
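As one example of metric monitoring, the sketch below queries an AVD-connected Log Analytics workspace for recent connection round-trip times. It assumes the Az.OperationalInsights module and that AVD Insights is populating the WVDConnectionNetworkData table; the workspace ID is a placeholder.

```powershell
# Minimal sketch: average AVD round-trip time over the last hour, in 5-minute bins.
$kql = @"
WVDConnectionNetworkData
| where TimeGenerated > ago(1h)
| summarize AvgRttMs = avg(EstRoundTripTimeInMs) by bin(TimeGenerated, 5m)
"@
$result = Invoke-AzOperationalInsightsQuery -WorkspaceId "<workspace-guid>" -Query $kql
$result.Results | Format-Table
```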
For this platform, some of the design features include the following:
- Latency goal of less than 200 ms round-trip time (RTT)
- Density of approximately 1/8 of a GPU for light users and 1/3 of a GPU for power users
- Profiles include FSLogix on Azure Files Premium and data on NetApp
To deploy Azure Virtual Desktop and the AVD Accelerator, complete the following steps; a scripted sketch of the first steps follows the list:
- Register the AVD provider using the Register-AzResourceProvider command.
- Create a host pool and workspace.
- Add session hosts with the appropriate GPU size and an ArcGIS Pro image.
- Configure the FSLogix path on Azure Files.
- Set the scaling plan and enable RDP Shortpath (UDP).
- Test performance with the ArcGIS Pro Performance Assessment Tool (PAT).
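The first two steps can be scripted as sketched below, assuming the Az.DesktopVirtualization module; the names, region, and session limit are illustrative.

```powershell
# Minimal sketch: register the provider, then create a pooled host pool and workspace.
Register-AzResourceProvider -ProviderNamespace "Microsoft.DesktopVirtualization"
$rg = "avd-gis-rg"; $loc = "eastus2"   # placeholder names
New-AzWvdHostPool -ResourceGroupName $rg -Name "arcgispro-pool" -Location $loc `
    -HostPoolType Pooled -LoadBalancerType BreadthFirst -PreferredAppGroupType Desktop `
    -MaxSessionLimit 8   # roughly in line with the 1/8-GPU-per-light-user density above
New-AzWvdWorkspace -ResourceGroupName $rg -Name "arcgispro-workspace" -Location $loc
```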
ArcGIS Pro Accelerator
For ArcGIS Pro users, Azure Virtual Desktop (AVD) provides a way to access GPU-accelerated desktops in the cloud. Instead of each analyst needing a dedicated GPU VM, AVD allows you to share GPU resources across multiple users, keeping costs predictable while still supporting 2D and 3D workflows.
The Mission Landing-Zone (MLZ) Accelerator adds ready-made templates, security policies, and monitoring so your IT team can stand up a governed AVD environment quickly. This means you can get ArcGIS Pro up and running quickly, without having to design all the networking, security, and deployment pieces yourself.
For ArcGIS Pro users, this platform may have benefits such as the following:
- Smooth 2D and 3D performance with shared GPU resources.
- User profiles and data can roam across sessions if you are using FSLogix with Azure Files Premium and data on NetApp.
- With built-in autoscaling, you only pay for the capacity you need.
- Out-of-the-box policies keep environments secure and reliable.
- Central dashboards track the performance of ArcGIS Pro for items such as frames per second, latency, and GPU usage.
This platform is ideal for organizations with multiple ArcGIS Pro analysts that want a managed, cost-efficient environment. It also allows you to quickly onboard new users by assigning them to the pool. This platform also allows for security governance and repeatable deployments.
For detailed ArcGIS Pro deployment steps, sample parameters, and scripts, see the ArcGIS section of the Azure Mission Landing-Zone repository.
Microsoft Windows 365 GPU Cloud PC
Windows 365 Cloud PCs provide each ArcGIS Pro analyst with a dedicated, GPU-accelerated desktop that has its patches and scaling managed by Microsoft. Since the desktop is delivered as a service, GIS administrators assign a Windows 365 license with the GPU add-on. The user can then start ArcGIS Pro from any device using Windows or a modern browser. This platform is ideal for those who need consistent ArcGIS Pro performance without the overhead of setting up an Azure Virtual Desktop host pool or joining a corporate virtual network.
Below are the different SKU types, including the vCPU, RAM, and GPU memory (vRAM) allocations and the recommended workflow types.
| SKU | vCPU/RAM/vRAM | Recommended ArcGIS Pro workloads |
|---|---|---|
| Standard | 4 vCPU/16 GB/8 GB | Basic 2D mapping, data review, use of dual 1080p monitors |
| Super | 8 vCPU/56 GB/12 GB | Mixed 2D and 3D editing, terrain visualization up to 4K |
| Max | 16 vCPU/110 GB/16 GB | Dense 3D scenes, lidar classification, deep learning tools |
To deploy a Microsoft Windows 365 GPU Cloud PC, complete the following steps; a scripted sketch of the licensing step follows the list:
- Assign licenses for Microsoft Windows 365 Enterprise and a GPU add-on for each analyst in Entra ID.
- Create a provisioning policy in Microsoft Intune. Select the GPU Cloud PC, and choose the SKU and Azure region closest to the users.
- Choose the image. From the Azure Compute Gallery, choose a custom image that contains ArcGIS Pro 3.x and the required Python packages.
- Assign the policy. Under Users/Groups, assign the policy for users.
- Verify that ArcGIS Pro runs as expected. The user signs in using the Windows application, starts ArcGIS Pro, and runs the ArcGIS Pro Performance Assessment Tool (PAT) for the baseline.
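The licensing step can be scripted with the Microsoft Graph PowerShell SDK. The sketch below is illustrative: the user principal name is a placeholder, and the SKU filter is an assumption, so confirm your tenant's exact Windows 365 GPU SKU with Get-MgSubscribedSku before assigning.

```powershell
# Minimal sketch: assign a Windows 365 license to an analyst via Microsoft Graph.
Connect-MgGraph -Scopes "User.ReadWrite.All", "Organization.Read.All"
# Cloud PC SKU part numbers typically start with "CPC"; verify in your tenant.
$sku = Get-MgSubscribedSku | Where-Object SkuPartNumber -like "CPC*" | Select-Object -First 1
Set-MgUserLicense -UserId "analyst@contoso.com" `
    -AddLicenses @(@{ SkuId = $sku.SkuId }) -RemoveLicenses @()
```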
Azure Local
When ArcGIS Pro users need to work directly with local data, whether in secure sites, field offices, or places without reliable internet, Azure Local delivers GPU-accelerated desktops on-premises, managed through the Azure portal. It's ideal when the public cloud isn't an option due to connectivity, compliance, or large datasets.
With ArcGIS Pro running next to local geodatabases, imagery, or lidar, Azure Local delivers workstation-level speed, smooth 2D/3D mapping, fast rendering, and responsive analysis.
For ArcGIS Pro users, this platform may have benefits such as the following:
- Projects run directly against local file geodatabases, imagery, and lidar without waiting on cloud transfers.
- GPU-powered sessions provide workstation-level responsiveness for visualization and analysis.
- Profiles and caches roam with a user's profile through FSLogix, so analysts can reconnect quickly after moving between locations or if connections drop.
- ArcGIS Pro sessions remain usable on local clusters, even if the internet connection to the public cloud is slow or down.
- The IT team can monitor performance, apply policies, and manage virtual desktops through the same Azure portal interface used for cloud resources.
The GIS-optimized reference design for this platform includes the following for a single node:
- Compute: 1 Hyper-V server with an Intel Xeon CPU and an NVIDIA L40S GPU, partitioned into vGPU slices sized for ArcGIS Pro users (see the sketch after this list).
- Storage: Local NVMe solid-state drives with separate folders for file geodatabases and raster caches.
- Profiles and caching: FSLogix profile containers are stored on the same node, with cloud cache enabled for quick reconnects.
- Networking: Local virtual network managed through Microsoft Azure Arc; an optional AVD Edge gateway can keep display traffic on-site.
- Monitoring: Azure Monitor agent and Log Analytics track GPU percentage, FPS, and storage latency.
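The vGPU slicing in this design can be configured with the built-in Hyper-V GPU partitioning cmdlets, as sketched below. The partition count and VM name are assumptions; run the Add-VMGpuPartitionAdapter step while the VM is stopped.

```powershell
# Minimal sketch: split a partitionable host GPU into slices and attach one
# to a session-host VM.
Get-VMHostPartitionableGpu | Set-VMHostPartitionableGpu -PartitionCount 8
Add-VMGpuPartitionAdapter -VMName "avd-host-01"   # placeholder VM name
Get-VMGpuPartitionAdapter -VMName "avd-host-01"   # verify the attachment
```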
To deploy Azure Local for ArcGIS Pro, complete the following steps; a scripted sketch of the share and profile steps follows the list:
- Install Azure Stack HCI 23H2 on validated GPU nodes and form the cluster.
- Register the cluster with Azure Arc using the Register-AzStackHCI cmdlet. Confirm a healthy status in the Azure portal.
- Enable GPU pass-through. Install the Windows Server GPU driver bundle. Configure the Discrete Device Assignment (DDA) or Hyper-V vGPU.
- Build an ArcGIS Pro image (version 3.x and GPU driver) with Azure VM Image Builder and publish it to an on-premises Azure Compute Gallery.
- Enable AVD for Azure Local (preview), set the on-premises cluster as the location for resources, and deploy the host pool sized according to the vGPU slices.
- Create Server Message Block (SMB) shares on the cluster for file geodatabases and raster caches, and map them to session hosts using a group policy object.
- Configure an FSLogix profile path to an on-cluster share and enable a 4 GB cloud cache for resilience.
- Publish ArcGIS Pro as a remote application from a RemoteApp application group and assign Entra ID users and groups.
- Validate performance with the ArcGIS Pro Performance Assessment Tool (PAT) and monitor GPU percentage and latency in Azure Monitor.
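The share and profile steps can be scripted on a cluster node as sketched below. The share paths, names, and security group are placeholders; Enabled and CCDLocations are standard FSLogix profile-container registry settings, and the 4 GB cloud cache size mentioned above is configured separately through FSLogix settings not shown here.

```powershell
# Minimal sketch: create SMB shares for GIS data and profiles, then point
# FSLogix at the on-cluster profile share with cloud cache for quick reconnects.
New-SmbShare -Name "gisdata"  -Path "C:\ClusterStorage\Volume1\gisdata"  -FullAccess "CONTOSO\GIS-Analysts"
New-SmbShare -Name "profiles" -Path "C:\ClusterStorage\Volume1\profiles" -FullAccess "CONTOSO\GIS-Analysts"

# On each session host: enable FSLogix profile containers with cloud cache.
$reg = "HKLM:\SOFTWARE\FSLogix\Profiles"
New-Item -Path $reg -Force | Out-Null
Set-ItemProperty -Path $reg -Name Enabled -Value 1 -Type DWord
Set-ItemProperty -Path $reg -Name CCDLocations -Type MultiString `
    -Value @("type=smb,connectionString=\\hci-node\profiles")
```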