Supported programming languages for Azure AI Inference SDK

All models deployed to Azure AI Foundry Models support the Azure AI Model Inference API and its associated family of SDKs.

To use these SDKs, connect them to the Azure AI model inference URI (usually in the form https://<resource-name>.services.ai.azure.com/models).
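
For example, with the Python azure-ai-inference package, a client can be pointed at that URI roughly as follows. This is a minimal sketch: the resource name and the AZURE_AI_API_KEY environment variable are placeholders, and you can authenticate with Microsoft Entra ID instead of a key.

```python
import os

from azure.ai.inference import ChatCompletionsClient
from azure.core.credentials import AzureKeyCredential

# Create a client against the resource's Azure AI model inference endpoint.
client = ChatCompletionsClient(
    endpoint="https://<resource-name>.services.ai.azure.com/models",
    credential=AzureKeyCredential(os.environ["AZURE_AI_API_KEY"]),  # placeholder key variable
)
```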

Azure AI Inference package

The Azure AI Inference package lets you consume all models deployed to an Azure AI Foundry resource and easily switch among them, as sketched in the example after the following table. It's part of the Azure AI Foundry SDK.

| Language | Documentation | Package | Examples |
|----------|---------------|---------|----------|
| C# | Reference | azure-ai-inference (NuGet) | C# examples |
| Java | Reference | azure-ai-inference (Maven) | Java examples |
| JavaScript | Reference | @azure/ai-inference (npm) | JavaScript examples |
| Python | Reference | azure-ai-inference (PyPi) | Python examples |
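
Because every model behind the resource is served through the same endpoint, changing models usually only requires changing the `model` parameter on the request. Here's a minimal Python sketch that reuses the client from the snippet above; the two model names are assumed deployments and exist only for illustration.

```python
from azure.ai.inference.models import UserMessage

messages = [UserMessage(content="What is the capital of France?")]

# Route the same request to different deployed models by name.
for model_name in ["mistral-large-2411", "Phi-4"]:  # assumed deployment names
    response = client.complete(messages=messages, model=model_name)
    print(f"{model_name}: {response.choices[0].message.content}")
```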

Integrations

| Framework | Language | Documentation | Package | Examples |
|-----------|----------|---------------|---------|----------|
| LangChain | Python | Reference | langchain-azure-ai (PyPi) | Python examples |
| Llama-Index | Python | Reference | llama-index-llms-azure-inference (PyPi), llama-index-embeddings-azure-inference (PyPi) | Python examples |
| Semantic Kernel | Python | Reference | semantic-kernel[azure] (PyPi) | Python examples |
| AutoGen | Python | Reference | autogen-ext[azure] (PyPi) | Quickstart |
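
These integrations wrap the same endpoint behind framework-specific interfaces. As a hedged example for LangChain, the sketch below uses the AzureAIChatCompletionsModel class from langchain-azure-ai; check the class and parameter names against the reference, and treat the endpoint, key variable, and model name as placeholders.

```python
import os

from langchain_azure_ai.chat_models import AzureAIChatCompletionsModel

# Expose a model deployed to the Azure AI Foundry resource as a LangChain chat model.
llm = AzureAIChatCompletionsModel(
    endpoint="https://<resource-name>.services.ai.azure.com/models",
    credential=os.environ["AZURE_AI_API_KEY"],  # placeholder; a TokenCredential also works
    model_name="<deployed-model-name>",
)

print(llm.invoke("Write a one-line greeting.").content)
```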

Limitations

Warning

The Cohere SDK and the Mistral SDK aren't supported in Azure AI Foundry.

Next steps

  • To see which models are currently supported, check out the Models section.