The Create AI Service Connection File tool creates an AI Service Connection (.ais) file that stores connection information for an external AI service. Several deep learning packages use third-party AI models and need access to hosted AI services. To use these services, you must supply provider-specific details such as an endpoint, model name, or API key.
Instead of entering these details as model arguments every time, you can provide an .ais file containing the configuration parameters. This also ensures that parameter values are not exposed in the History pane and that sensitive values are stored securely by the operating system. The .ais file is not shareable across machines or between users, because the tool stores sensitive credentials in Windows Credential Manager and associates them with the local system and user account.
Using this approach reduces setup time, avoids errors, and provides a secure and consistent way to integrate external AI services with GIS workflows.
Service providers and configuration
The Create AI Service Connection File tool supports multiple AI service providers, and each provider requires specific configuration details. These details may include general information such as a model name or endpoint, as well as sensitive information like API keys or tokens.
The tool provides a set of required connection parameters for each service provider, which are specified using the Connection Parameters parameter. Sensitive values, such as API keys or tokens, are specified using the Secret Parameter Value parameter. Any value entered in the Secret Parameter Value parameter is treated as sensitive information and is saved in Windows Credential Manager, while non-sensitive information is stored in the .ais file.
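The split between sensitive and non-sensitive values can be illustrated with a minimal sketch. This is not the tool's implementation: the key and parameter names are illustrative, and a plain dictionary stands in for Windows Credential Manager, which the real tool uses.

```python
import json

# Illustrative sketch only: separate a sensitive value from the
# ordinary connection parameters, mimicking how the tool writes
# only non-sensitive values to the .ais file. A dict stands in
# for Windows Credential Manager here.
credential_store = {}

def build_ais_payload(provider, parameters, secret_name, secret_value):
    """Keep the secret out of the file; record only a reference name."""
    credential_store[secret_name] = secret_value
    return {
        "version": "1.0",
        "serviceProvider": provider,
        "authenticationSecrets": {"token": secret_name},
        "serviceProviderProperties": parameters,
    }

payload = build_ais_payload(
    "AWS",
    {"aws_access_key": "IAMAWSTESTKEY",
     "model_id": "amazon.titan-text-premier-v1:0"},
    secret_name="aws_secret_key",
    secret_value="test1234api567key89",
)
print(json.dumps(payload, indent=2))
```

The point of the pattern is that serializing the payload never exposes the secret value itself, only the name under which it is stored.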
The following sections describe the supported service providers and their respective connection parameters:
Note:
Each service provider comes with a default set of connection parameters, but additional parameters can be added as required by a specific model or workflow.
AWS
AWS provides AI models through Bedrock and related services. This option includes the following connection parameters:
- Access Key—A unique identifier associated with your AWS account that is used to authenticate requests, for example: IAMAWSTESTKEY
- Model ID—The identifier of the specific model hosted that you want to use, for example: amazon.titan-text-premier-v1:0
- Region Name—The AWS region where your service is deployed, for example: us-east-1
Anthropic
Anthropic provides the Claude family of models for text and reasoning tasks. This option includes the Model connection parameter, which is the name of the Anthropic model, for example: claude-3-opus.
Azure
Azure provides enterprise-hosted OpenAI models with configurable deployments and versions. This option includes the following connection parameters:
- Endpoint URI—The base URL of your Azure service, for example: https://mytestazureopenai.openai.azure.com/
- Deployment Name—The name of the model deployment you created in Azure, for example: gpt-4o
- API Version—The version of the Azure OpenAI API you are targeting, for example: 2024-05-01
Hugging Face
Hugging Face hosts thousands of open-source models for text, vision, and multimodal AI. This option includes the Model ID connection parameter, which is the identifier of the model you want to use from the Hugging Face Hub, for example: facebook/detr-resnet-50.
OpenAI
OpenAI provides models for language, reasoning, and multimodal tasks. This option includes the Model connection parameter, which is the name of the model you want to use, for example: gpt-4o-mini.
Google Cloud
Google Cloud’s Vertex AI provides foundation models for text, chat, and vision tasks. This option includes the following connection parameters:
- Project ID—The Google Cloud project identifier, for example: my-gcp-project
- Region—The location where the service is hosted, for example: us-central1
- Model Name—The identifier of the specific model you want to use, for example: text-bison
Others
The Others option supports connections to other providers. Define your own parameter names and values to match the provider’s API requirements.
- Custom endpoint—The custom endpoint URL, for example: https://example.ai/api
- Model—The name of the model, for example: my-custom-model
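As a rough sketch, the per-provider parameter sets described above could be modeled as a lookup table with a simple completeness check. The provider and parameter names mirror the sections on this page; this is not the tool's internal schema.

```python
# Sketch of the default connection parameters listed above,
# keyed by service provider. Names follow this page, not the
# tool's internal representation.
REQUIRED_PARAMETERS = {
    "AWS": {"Access Key", "Model ID", "Region Name"},
    "Anthropic": {"Model"},
    "Azure": {"Endpoint URI", "Deployment Name", "API Version"},
    "Hugging Face": {"Model ID"},
    "OpenAI": {"Model"},
    "Google Cloud": {"Project ID", "Region", "Model Name"},
}

def missing_parameters(provider, supplied):
    """Return the required parameter names the caller has not supplied."""
    return REQUIRED_PARAMETERS.get(provider, set()) - set(supplied)

# Supplying only the access key for AWS leaves two parameters missing.
print(sorted(missing_parameters("AWS", {"Access Key": "IAMAWSTESTKEY"})))
```

Because additional parameters can be added per model or workflow, a real check would treat these sets as a minimum, not an exhaustive list.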
AIS file and credential management
An .ais file contains connection details, such as model names, endpoints, and regions. These values are stored in the .ais file in a structured format.
Here is a sample .ais file:

```json
{
  "version": "1.0",
  "serviceProvider": "AWS",
  "protocol": "",
  "host": "",
  "authenticationScheme": "accessToken",
  "authenticationProperties": {
    "parameterType": "header",
    "parameterName": "aws_secret_key"
  },
  "authenticationSecrets": {
    "token": "test1234api567key89"
  },
  "serviceProviderProperties": {
    "aws_access_key": "IAMAWSTESTKEY",
    "model_id": "amazon.titan-text-premier-v1:0",
    "aws_region_name": "us-west-2"
  }
}
```
Sensitive credentials, such as API keys or tokens, are not written directly to the .ais file. Instead, the file stores only a reference through the authenticationSecrets key, while the actual values are securely kept in Windows Credential Manager. This ensures that sensitive information never leaves the local machine and cannot be exposed by opening the .ais file.
Because the file references machine-specific credential entries, an .ais file cannot be shared across users or computers. If copied to another machine, it will not work unless the same credentials are re-entered and saved locally through the Create AI Service Connection File tool.
Use the .ais file
The saved .ais file can be provided as input to third-party AI models through their model arguments. The model reads the .ais file, extracts the required configuration details (such as connection parameters and credentials), and uses them to establish a connection with the hosted service provider.
Each third-party model that supports .ais files expects a defined set of connection parameters. If the parameters in the .ais file do not match what the model requires, the connection will fail, and the model may not function as expected.
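A model integration that consumes .ais files might perform a check along these lines before connecting. This is a minimal sketch: the key names follow the AWS sample above, and real models define their own expected parameter sets and validation logic.

```python
import json

# Minimal sketch of reading an .ais file and verifying that the
# connection parameters a model expects are present. Key names
# follow the AWS sample on this page.
def load_provider_properties(text):
    """Parse .ais JSON text and return its serviceProviderProperties."""
    return json.loads(text).get("serviceProviderProperties", {})

def check_parameters(properties, expected):
    """Return the expected parameter names missing from the file."""
    return sorted(set(expected) - set(properties))

sample = """{
  "serviceProvider": "AWS",
  "serviceProviderProperties": {
    "aws_access_key": "IAMAWSTESTKEY",
    "model_id": "amazon.titan-text-premier-v1:0",
    "aws_region_name": "us-west-2"
  }
}"""

props = load_provider_properties(sample)
missing = check_parameters(props, ["aws_access_key", "model_id", "aws_region_name"])
print(missing)  # an empty list means the required parameters are present
```

If the list is non-empty, the connection attempt would fail in the way the paragraph above describes, so surfacing the missing names early gives a clearer error than a failed service call.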