Applies to:
SQL Server 2025 (17.x) Preview
Creates an external model object that contains the ___location, authentication method, and purpose of an AI model inference endpoint.
Syntax
Transact-SQL syntax conventions
CREATE EXTERNAL MODEL external_model_object_name
[ AUTHORIZATION owner_name ]
WITH
( LOCATION = '<prefix>://<path>[:<port>]'
, API_FORMAT = '<OpenAI, Azure OpenAI, etc>'
, MODEL_TYPE = EMBEDDINGS
, MODEL = 'text-embedding-ada-002'
[ , CREDENTIAL = <credential_name> ]
[ , PARAMETERS = '{"valid":"JSON"}' ]
);
Arguments
external_model_object_name
Specifies the user-defined name for the external model. The name must be unique within the database.
owner_name
Specifies the name of the user or role that owns the external model. If not specified, ownership is given to the current user. Depending on their permissions and roles, users might need to be granted explicit permission to use specific external models.
LOCATION
Provides the connectivity protocol and path to the AI model inference endpoint.
API_FORMAT
The API message format for the AI model inference endpoint provider.
Accepted values are:
Azure OpenAI
OpenAI
Ollama
MODEL_TYPE
The type of model being accessed from the AI model inference endpoint ___location.
Accepted values are:
EMBEDDINGS
MODEL
The specific model hosted by the AI provider. For example, text-embedding-ada-002, text-embedding-3-large, or o3-mini.
CREDENTIAL
Indicates which DATABASE SCOPED CREDENTIAL object is used with the AI model inference endpoint. For more information on accepted credential types and naming rules, see sp_invoke_external_rest_endpoint or the Remarks section of this article.
PARAMETERS
A valid JSON string that contains parameters to be appended to the AI model inference endpoint request message. For example:
'{"Dimensions" : 1536}'
Permissions
External model creation and altering
Requires the ALTER ANY EXTERNAL MODEL or CREATE EXTERNAL MODEL database permission.
For example:
GRANT CREATE EXTERNAL MODEL TO [<PRINCIPAL>];
or
GRANT ALTER ANY EXTERNAL MODEL TO [<PRINCIPAL>];
External model grants
To use an external model in an AI function, a principal must be granted the ability to EXECUTE it.
For example:
GRANT EXECUTE ON EXTERNAL MODEL::MODEL_NAME TO [<PRINCIPAL>];
GO
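With the grant in place, the principal can reference the model from an AI function. The following is a minimal sketch that assumes a model named MyAzureOpenAiModel (created in the Examples section of this article); adjust the model name and input text for your environment.

-- Generate an embedding by using a model the principal has EXECUTE permission on
SELECT AI_GENERATE_EMBEDDINGS(N'The quick brown fox jumps over the lazy dog' USE MODEL MyAzureOpenAiModel);
GO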
Retry count
If the embeddings call encounters HTTP status codes indicating temporary issues, you can configure the request to automatically retry. To specify the number of retries, add the following JSON to the PARAMETERS option on the EXTERNAL MODEL. The NUMBER_OF_RETRIES should be a whole number between zero (0) and ten (10), inclusive, and can't be NULL or negative:
{"sql_rest_options":{"retry_count":NUMBER_OF_RETRIES}}
For example, to set the retry count to 3, you would write the following JSON string:
{"sql_rest_options":{"retry_count":3}}
Retry count with other parameters
Retry count can also be combined with other parameters as long as it's a valid JSON string.
{"Dimensions":725,"sql_rest_options":{"retry_count":5}}
Remarks
HTTPS and TLS
Only AI model inference endpoints that are configured to use HTTPS with the TLS encryption protocol are supported for the LOCATION parameter.
Accepted API formats and model types
The following sections outline the accepted API formats for each MODEL_TYPE.
API_FORMAT for EMBEDDINGS
This table outlines the API formats and URL endpoint structures for the EMBEDDINGS model type. To view specific payload structures, use the link in the API format column.

| API format | Location path format |
| --- | --- |
| Azure OpenAI | https://{endpoint}/openai/deployments/{deployment-id}/embeddings?api-version={date} |
| OpenAI | https://{server_name}/v1/embeddings |
| Ollama | https://localhost:{port}/api/embed |
Create embedding endpoints
For more information on creating embedding endpoints, see the documentation for the appropriate AI model inference endpoint provider.
Credential name rules for EXTERNAL MODEL
The DATABASE SCOPED CREDENTIAL used by an EXTERNAL MODEL must adhere to the following rules:
- Must be a valid URL
- The URL ___domain must be one of those domains included in the allowlist
- The URL must not contain a query string
- Protocol + Fully Qualified Domain Name (FQDN) of the called URL must match Protocol + FQDN of the credential name
- Each part of the called URL path must match completely with the respective part of URL path in the credential name
- The credential must point to a path that is more generic than the request URL. For example, a credential created for the path https://northwind.azurewebsite.net/customers can't be used for the URL https://northwind.azurewebsite.net (see the sketch after this list).
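To make the path rule concrete, here's a minimal sketch that mirrors the pattern shown in the Examples section; the endpoint name is a placeholder. The credential is named for the endpoint root, which is more generic than the deployment-specific LOCATION, so the model can use it.

-- Credential name covers the endpoint root
CREATE DATABASE SCOPED CREDENTIAL [https://my-azure-openai-endpoint.openai.azure.com/]
WITH IDENTITY = 'HTTPEndpointHeaders', secret = '{"api-key":"YOUR_AZURE_OPENAI_KEY"}';
GO

-- LOCATION is a more specific path under the credential name, so the rules above are satisfied
CREATE EXTERNAL MODEL MyEmbeddingModel
WITH (
    LOCATION = 'https://my-azure-openai-endpoint.openai.azure.com/openai/deployments/text-embedding-ada-002/embeddings?api-version=2024-02-01',
    API_FORMAT = 'Azure OpenAI',
    MODEL_TYPE = EMBEDDINGS,
    MODEL = 'text-embedding-ada-002',
    CREDENTIAL = [https://my-azure-openai-endpoint.openai.azure.com/]
);
GO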
Collation and credential name rules
RFC 3986 Section 6.2.2.1 states that "When a URI uses components of the generic syntax, the component syntax equivalence rules always apply; namely, that the scheme and host are case-insensitive," and RFC 7230 Section 2.7.3 mentions that "all other components are compared in a case-sensitive manner."
Because a collation rule is set at the database level, the following logic is applied to remain coherent with both the database collation rule and the RFCs mentioned previously. (The described rule could potentially be more restrictive than the RFC rules, for example if the database is set to use a case-sensitive collation.)
- Check if the URL and credential match using the RFC, which means:
  - Check the scheme and host using a case-insensitive collation (Latin1_General_100_CI_AS_KS_WS_SC).
  - Check that all other segments of the URL are compared in a case-sensitive collation (Latin1_General_100_BIN2).
- Check that the URL and credential match using the database collation rules (and without doing any URL encoding).
Managed Identity
To use Managed Identity for authentication on SQL Server 2025, you must enable the option by using sp_configure with a user that is granted the ALTER SETTINGS server-level permission.
EXECUTE sp_configure 'allow server scoped db credentials', 1;
RECONFIGURE WITH OVERRIDE;
SCHEMABINDING
If a view created with SCHEMABINDING references an EXTERNAL MODEL (for example, through a SELECT statement that uses AI_GENERATE_EMBEDDINGS), the EXTERNAL MODEL can't be dropped and the Database Engine raises an error. The view definition itself must first be modified or dropped to remove the dependencies that reference the EXTERNAL MODEL.
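The following is a minimal sketch of this behavior. It assumes a hypothetical dbo.Products table with ProductID and Description columns and an external model named MyAzureOpenAiModel like the one created in the Examples section; the exact error text might vary.

-- Schema-bound view that depends on the external model
CREATE VIEW dbo.ProductEmbeddings
WITH SCHEMABINDING
AS
SELECT ProductID,
       AI_GENERATE_EMBEDDINGS(Description USE MODEL MyAzureOpenAiModel) AS DescriptionEmbedding
FROM dbo.Products;
GO

-- Fails while the schema-bound view still references the model
DROP EXTERNAL MODEL MyAzureOpenAiModel;
GO

-- Drop (or alter) the view first, then the external model can be dropped
DROP VIEW dbo.ProductEmbeddings;
DROP EXTERNAL MODEL MyAzureOpenAiModel;
GO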
Catalog View
External model metadata can be viewed by querying the sys.external_models catalog view. Note that you must have access to a model in order to view its metadata.
SELECT * FROM sys.external_models;
Examples
Create an EXTERNAL MODEL with Azure OpenAI using Managed Identity
This example creates an EXTERNAL MODEL of the EMBEDDINGS type using Azure OpenAI and uses Managed Identity for authentication.
Important
If using Managed Identity with Azure OpenAI and SQL Server 2025, the Cognitive Services OpenAI Contributor role must be granted to SQL Server's system-assigned managed identity, which is created by Azure Arc. For more information, see Role-based access control for Azure OpenAI in Azure AI Foundry Models.
-- Create access credentials to Azure OpenAI using a managed identity:
CREATE DATABASE SCOPED CREDENTIAL [https://my-azure-openai-endpoint.openai.azure.com/]
WITH IDENTITY = 'Managed Identity', secret = '{"resourceid":"https://cognitiveservices.azure.com"}';
GO
-- Create the EXTERNAL MODEL
CREATE EXTERNAL MODEL MyAzureOpenAiModel
AUTHORIZATION CRM_User
WITH (
LOCATION = 'https://my-azure-openai-endpoint.openai.azure.com/openai/deployments/text-embedding-ada-002/embeddings?api-version=2024-02-01',
API_FORMAT = 'Azure OpenAI',
MODEL_TYPE = EMBEDDINGS,
MODEL = 'text-embedding-ada-002',
CREDENTIAL = [https://my-azure-openai-endpoint.openai.azure.com/]
);
Create an EXTERNAL MODEL with Azure OpenAI using API keys and parameters
This example creates an EXTERNAL MODEL of the EMBEDDINGS type using Azure OpenAI and uses API keys for authentication. The example also uses PARAMETERS to set the Dimensions parameter at the endpoint to 725.
-- Create access credentials to Azure OpenAI using a key:
CREATE DATABASE SCOPED CREDENTIAL [https://my-azure-openai-endpoint.openai.azure.com/]
WITH IDENTITY = 'HTTPEndpointHeaders', secret = '{"api-key":"YOUR_AZURE_OPENAI_KEY"}';
GO
-- Create the EXTERNAL MODEL
CREATE EXTERNAL MODEL MyAzureOpenAiModel
AUTHORIZATION CRM_User
WITH (
LOCATION = 'https://my-azure-openai-endpoint.openai.azure.com/openai/deployments/text-embedding-3-small/embeddings?api-version=2024-02-01',
API_FORMAT = 'Azure OpenAI',
MODEL_TYPE = EMBEDDINGS,
MODEL = 'text-embedding-3-small',
CREDENTIAL = [https://my-azure-openai-endpoint.openai.azure.com/],
PARAMETERS = '{"Dimensions":725}'
);
Create an EXTERNAL MODEL with Ollama and an explicit owner
This example creates an EXTERNAL MODEL of the EMBEDDINGS type using Ollama hosted locally for development purposes.
CREATE EXTERNAL MODEL MyOllamaModel
AUTHORIZATION AI_User
WITH (
LOCATION = 'https://localhost:11435/api/embed',
API_FORMAT = 'Ollama',
MODEL_TYPE = EMBEDDINGS,
MODEL = 'all-minilm'
);
Create an EXTERNAL MODEL with OpenAI
This example creates an EXTERNAL MODEL of the EMBEDDINGS type using the OpenAI API_FORMAT and HTTP header-based credentials for authentication.
-- Create access credentials
CREATE DATABASE SCOPED CREDENTIAL [https://openai.com]
WITH IDENTITY = 'HTTPEndpointHeaders', secret = '{"Bearer":"YOUR_OPENAI_KEY"}';
GO
-- Create the external model
CREATE EXTERNAL MODEL MyOpenAiModel
AUTHORIZATION CRM_User
WITH (
LOCATION = 'https://api.openai.com/v1/embeddings',
API_FORMAT = 'OpenAI',
MODEL_TYPE = EMBEDDINGS,
MODEL = 'text-embedding-ada-002',
CREDENTIAL = [https://openai.com]
);
Related content
- ALTER EXTERNAL MODEL (Transact-SQL)
- DROP EXTERNAL MODEL (Transact-SQL)
- AI_GENERATE_EMBEDDINGS (Transact-SQL)
- AI_GENERATE_CHUNKS (Transact-SQL)
- sys.external_models
- Create and deploy an Azure OpenAI Service resource
- Server configuration options
- Role-based access control for Azure OpenAI in Azure AI Foundry Models