# Configure Azure AI Provider
**Note:** This guide uses the canonical provider type for this platform (`claude` for Bedrock, `gemini` for Vertex, `openai` for Azure). Other provider types are configurable, but their request routing depends on PromptKit#1009.
This guide covers how to configure an Omnia Provider to use Azure AI Services (Azure OpenAI) for LLM access. Azure AI providers support two authentication methods: Azure AD Workload Identity for production use, and service principals for simpler setups.
## Prerequisites

- An AKS cluster with OIDC issuer enabled
- An Azure AI (Azure OpenAI) resource deployed
- The `az` CLI installed and authenticated
- Omnia operator installed in the cluster
## Option 1: Workload Identity (Recommended)

Azure AD Workload Identity lets Kubernetes pods authenticate as a managed identity without storing credentials. This is the recommended approach for production.
### 1. Create a managed identity

```sh
az identity create \
  --name omnia-azure-ai \
  --resource-group my-resource-group \
  --location eastus
```

Note the `clientId` from the output; you'll need it later.
### 2. Assign the Cognitive Services role

```sh
az role assignment create \
  --assignee <managed-identity-client-id> \
  --role "Cognitive Services OpenAI User" \
  --scope /subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.CognitiveServices/accounts/<resource-name>
```

### 3. Establish a federated credential
Get the OIDC issuer URL from your AKS cluster:

```sh
az aks show \
  --name my-cluster \
  --resource-group my-resource-group \
  --query "oidcIssuerProfile.issuerUrl" -o tsv
```

Create the federated credential:

```sh
az identity federated-credential create \
  --name omnia-federated \
  --identity-name omnia-azure-ai \
  --resource-group my-resource-group \
  --issuer <oidc-issuer-url> \
  --subject system:serviceaccount:agents:omnia-agent \
  --audiences api://AzureADTokenExchange
```

### 4. Annotate the service account
Configure the service account via Helm values:

```yaml
# values.yaml
serviceAccount:
  labels:
    azure.workload.identity/use: "true"
  annotations:
    azure.workload.identity/client-id: <managed-identity-client-id>
```

### 5. Create the Provider
```yaml
apiVersion: omnia.altairalabs.ai/v1alpha1
kind: Provider
metadata:
  name: azure-openai
  namespace: agents
spec:
  type: openai
  model: gpt-4o
  platform:
    type: azure
    region: eastus
    endpoint: https://my-resource.openai.azure.com
  auth:
    type: workloadIdentity
  capabilities:
    - text
    - streaming
    - tools
    - json
```

### 6. Verify
```sh
kubectl get provider azure-openai -n agents -o wide
kubectl get provider azure-openai -n agents -o jsonpath='{.status.conditions}' | jq .
```

Both the `AuthConfigured` and `Ready` conditions should be `True`.
## Option 2: Service Principal

For development or environments without Workload Identity, you can use service principal credentials.
### 1. Create a service principal

```sh
az ad sp create-for-rbac \
  --name omnia-azure-ai-sp \
  --role "Cognitive Services OpenAI User" \
  --scopes /subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.CognitiveServices/accounts/<resource-name>
```

### 2. Create a Secret
```sh
kubectl create secret generic azure-credentials \
  --namespace agents \
  --from-literal=AZURE_CLIENT_ID=<app-id> \
  --from-literal=AZURE_CLIENT_SECRET=<password> \
  --from-literal=AZURE_TENANT_ID=<tenant-id>
```

### 3. Create the Provider
```yaml
apiVersion: omnia.altairalabs.ai/v1alpha1
kind: Provider
metadata:
  name: azure-openai
  namespace: agents
spec:
  type: openai
  model: gpt-4o
  platform:
    type: azure
    region: eastus
    endpoint: https://my-resource.openai.azure.com
  auth:
    type: servicePrincipal
    credentialsSecretRef:
      name: azure-credentials
  capabilities:
    - text
    - streaming
    - tools
    - json
```

## Using with AgentRuntime
Reference the Provider from an AgentRuntime:

```yaml
apiVersion: omnia.altairalabs.ai/v1alpha1
kind: AgentRuntime
metadata:
  name: my-agent
  namespace: agents
spec:
  promptPackRef:
    name: my-prompts
  providerRef:
    name: azure-openai
  facade:
    type: websocket
    port: 8080
```

## Troubleshooting
### Endpoint URL format

The `platform.endpoint` must be the full Azure OpenAI resource URL, including `https://` and the `.openai.azure.com` suffix:

```
https://my-resource.openai.azure.com
```

Do not include a trailing slash or API version path.
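A quick shell check can catch formatting mistakes before you apply the manifest. This is an illustrative sketch, not part of Omnia's tooling; `ENDPOINT` is a placeholder value to substitute with your own:

```sh
# Hypothetical pre-flight check for the endpoint value (placeholder shown).
ENDPOINT="https://my-resource.openai.azure.com"

# Reject trailing slashes, API version paths, and non-Azure-OpenAI hosts.
if [[ "$ENDPOINT" =~ ^https://[a-z0-9-]+\.openai\.azure\.com$ ]]; then
  echo "endpoint format ok"
else
  echo "endpoint format invalid" >&2
  exit 1
fi
```

A value like `https://my-resource.openai.azure.com/` or `https://my-resource.openai.azure.com/openai/deployments/...` fails the check, matching the rule above.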
### Identity not federated

If using workload identity and the Provider shows `AuthConfigured: False`, verify the federated credential exists:

```sh
az identity federated-credential list \
  --identity-name omnia-azure-ai \
  --resource-group my-resource-group
```

Ensure the subject matches `system:serviceaccount:<namespace>:<service-account-name>`.
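The expected subject string can be composed mechanically from its two inputs, which helps avoid typos when comparing against the listed credential. A minimal sketch, assuming the `agents` namespace and `omnia-agent` service account used throughout this guide:

```sh
# Build the expected federated-credential subject from namespace and SA name.
NAMESPACE="agents"
SERVICE_ACCOUNT="omnia-agent"
EXPECTED_SUBJECT="system:serviceaccount:${NAMESPACE}:${SERVICE_ACCOUNT}"

echo "$EXPECTED_SUBJECT"   # system:serviceaccount:agents:omnia-agent
```

Compare this value against the `subject` field in the `az identity federated-credential list` output; any mismatch (including namespace) causes token exchange to fail.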
### Role assignment missing

Verify the managed identity or service principal has the correct role:

```sh
az role assignment list \
  --assignee <client-id> \
  --scope /subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.CognitiveServices/accounts/<resource-name>
```

### Checking Provider conditions
```sh
kubectl describe provider azure-openai -n agents
```

Look at the `Conditions` section for `AuthConfigured`, `CredentialConfigured`, and `Ready`.