
Configure Azure AI Provider

Note: This guide uses the canonical provider type for this platform (claude for Bedrock, gemini for Vertex, openai for Azure). Other provider types are configurable but their request routing depends on PromptKit#1009.

This guide covers how to configure an Omnia Provider to use Azure AI Services (Azure OpenAI) for LLM access. Azure AI providers support two authentication methods: Azure AD Workload Identity for production use, and service principals for simpler setups.

Prerequisites

  • An AKS cluster with OIDC issuer enabled
  • An Azure AI (Azure OpenAI) resource deployed
  • az CLI installed and authenticated
  • Omnia operator installed in the cluster
Option 1: Workload Identity (Recommended)

Azure AD Workload Identity lets Kubernetes pods authenticate as a managed identity without storing credentials. This is the recommended approach for production.

Create a user-assigned managed identity:
az identity create \
--name omnia-azure-ai \
--resource-group my-resource-group \
--location eastus

Note the clientId from the output — you’ll need it later.
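If you are scripting the setup, you can capture the client ID into a variable instead of copying it from the output. A sketch, assuming the identity and resource group names used above:

```shell
# Capture the managed identity's client ID for the later steps.
CLIENT_ID=$(az identity show \
  --name omnia-azure-ai \
  --resource-group my-resource-group \
  --query clientId -o tsv)
echo "$CLIENT_ID"
```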

Grant the identity access to your Azure OpenAI resource:
az role assignment create \
--assignee <managed-identity-client-id> \
--role "Cognitive Services OpenAI User" \
--scope /subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.CognitiveServices/accounts/<resource-name>
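The --scope value is the fully qualified Azure resource ID of your OpenAI resource. If you prefer to build it in a script, it is plain string interpolation (the values below are placeholders; substitute your own):

```shell
# Placeholder values; replace with your subscription, resource group, and resource names.
SUBSCRIPTION_ID="00000000-0000-0000-0000-000000000000"
RESOURCE_GROUP="my-resource-group"
RESOURCE_NAME="my-resource"

# Fully qualified resource ID, used as the --scope argument above.
SCOPE="/subscriptions/${SUBSCRIPTION_ID}/resourceGroups/${RESOURCE_GROUP}/providers/Microsoft.CognitiveServices/accounts/${RESOURCE_NAME}"
echo "$SCOPE"
```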

Get the OIDC issuer URL from your AKS cluster:

az aks show \
--name my-cluster \
--resource-group my-resource-group \
--query "oidcIssuerProfile.issuerUrl" -o tsv

Create the federated credential:

az identity federated-credential create \
--name omnia-federated \
--identity-name omnia-azure-ai \
--resource-group my-resource-group \
--issuer <oidc-issuer-url> \
--subject system:serviceaccount:agents:omnia-agent \
--audiences api://AzureADTokenExchange
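The --subject must encode the namespace and service account that the agent pods run under; the format is fixed:

```shell
# Kubernetes service account identity, as encoded in the federated credential subject.
NAMESPACE="agents"
SERVICE_ACCOUNT="omnia-agent"
SUBJECT="system:serviceaccount:${NAMESPACE}:${SERVICE_ACCOUNT}"
echo "$SUBJECT"
```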

Configure the service account via Helm values:

# values.yaml
serviceAccount:
  labels:
    azure.workload.identity/use: "true"
  annotations:
    azure.workload.identity/client-id: <managed-identity-client-id>

Then create the Provider resource:

apiVersion: omnia.altairalabs.ai/v1alpha1
kind: Provider
metadata:
  name: azure-openai
  namespace: agents
spec:
  type: openai
  model: gpt-4o
  platform:
    type: azure
    region: eastus
    endpoint: https://my-resource.openai.azure.com
    auth:
      type: workloadIdentity
  capabilities:
    - text
    - streaming
    - tools
    - json

Verify the Provider:

kubectl get provider azure-openai -n agents -o wide
kubectl get provider azure-openai -n agents -o jsonpath='{.status.conditions}' | jq .

Both the AuthConfigured and Ready conditions should be True.
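In CI or scripts, you can block until the Provider is ready instead of polling by hand. A sketch, assuming the Provider resource reports a standard Ready condition as described above:

```shell
# Wait up to 60s for the Provider's Ready condition to become True.
kubectl wait provider/azure-openai \
  --namespace agents \
  --for=condition=Ready \
  --timeout=60s
```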

Option 2: Service Principal

For development or environments without Workload Identity, you can use service principal credentials.

Create a service principal with the required role:
az ad sp create-for-rbac \
--name omnia-azure-ai-sp \
--role "Cognitive Services OpenAI User" \
--scopes /subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.CognitiveServices/accounts/<resource-name>
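az ad sp create-for-rbac prints JSON containing the appId, password, and tenant values used in the next step. A sketch of extracting them with jq (jq is an assumption, not a requirement of this guide; the JSON below is a placeholder showing the output shape):

```shell
# Placeholder JSON mimicking the shape of `az ad sp create-for-rbac` output.
SP_JSON='{"appId":"11111111-1111-1111-1111-111111111111","password":"example-secret","tenant":"22222222-2222-2222-2222-222222222222"}'

APP_ID=$(echo "$SP_JSON" | jq -r .appId)
CLIENT_SECRET=$(echo "$SP_JSON" | jq -r .password)
TENANT_ID=$(echo "$SP_JSON" | jq -r .tenant)
echo "$APP_ID"
```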

Store the credentials in a Kubernetes Secret:

kubectl create secret generic azure-credentials \
--namespace agents \
--from-literal=AZURE_CLIENT_ID=<app-id> \
--from-literal=AZURE_CLIENT_SECRET=<password> \
--from-literal=AZURE_TENANT_ID=<tenant-id>

Create the Provider resource referencing the secret:

apiVersion: omnia.altairalabs.ai/v1alpha1
kind: Provider
metadata:
  name: azure-openai
  namespace: agents
spec:
  type: openai
  model: gpt-4o
  platform:
    type: azure
    region: eastus
    endpoint: https://my-resource.openai.azure.com
    auth:
      type: servicePrincipal
      credentialsSecretRef:
        name: azure-credentials
  capabilities:
    - text
    - streaming
    - tools
    - json

Reference the Provider from an AgentRuntime:

apiVersion: omnia.altairalabs.ai/v1alpha1
kind: AgentRuntime
metadata:
  name: my-agent
  namespace: agents
spec:
  promptPackRef:
    name: my-prompts
  providerRef:
    name: azure-openai
  facade:
    type: websocket
    port: 8080

Endpoint Format

The platform.endpoint must be the full Azure OpenAI resource URL, including https:// and the .openai.azure.com suffix:

https://my-resource.openai.azure.com

Do not include a trailing slash or API version path.
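A quick sanity check for the endpoint format (a sketch; the regex below is an assumption covering the common resource-name pattern, not an official rule):

```shell
endpoint="https://my-resource.openai.azure.com"

# Reject trailing slashes, extra path segments, and a missing scheme.
if [[ "$endpoint" =~ ^https://[a-z0-9-]+\.openai\.azure\.com$ ]]; then
  result="ok"
else
  result="malformed"
fi
echo "endpoint $result"
```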

Troubleshooting

If you are using workload identity and the Provider shows AuthConfigured: False, verify that the federated credential exists:

az identity federated-credential list \
--identity-name omnia-azure-ai \
--resource-group my-resource-group

Ensure the subject matches system:serviceaccount:<namespace>:<service-account-name>.

Verify the managed identity or service principal has the correct role:

az role assignment list \
--assignee <client-id> \
--scope /subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.CognitiveServices/accounts/<resource-name>

Inspect the Provider's conditions:

kubectl describe provider azure-openai -n agents

Look at the Conditions section for AuthConfigured, CredentialConfigured, and Ready.