Braintrust integrates with Azure AI Foundry, giving you access to OpenAI models and the full Azure model catalog, including models from Grok, Anthropic, DeepSeek, and more.
Azure OpenAI Service is also supported through the same configuration.

Deploy the model

In Azure AI Foundry, deploy the model you want to use.

Configure the integration

  1. In your Braintrust project, go to Settings.
  2. Under Project, select Project AI providers.
  3. Under Cloud providers, click Azure.
  4. Choose your authentication method:
    • API Key: Paste your Azure API key.
    • Entra API (Azure AD): Provide your Microsoft Entra ID credentials to authenticate via Azure Active Directory.
    API keys are stored as one-way cryptographic hashes, never in plaintext.
  5. Set API base URL to the base URL of your project endpoint in Azure. For example, for the project endpoint https://john-3396-resource.services.ai.azure.com/api/projects/john-3396, you’d use https://john-3396-resource.services.ai.azure.com.
  6. Depending on the model you deployed, configure model details:
    • OpenAI models: Braintrust’s built-in Azure registry includes OpenAI models (such as the GPT series). To access these models through Braintrust, enable Include the default registry of Azure models.
    • Other models: If you deployed any non-OpenAI model from the Azure model catalog, such as Grok, Claude, or DeepSeek, configure the model details:
      • Set Models to the exact deployment name in Azure.
      • Select the Format based on the API the model supports:

        API type          Braintrust format
        Chat completion   OpenAI
        Responses         OpenAI
        Messages          Anthropic
      For full configuration options, see Custom providers.
      Non-OpenAI models appear in the Braintrust model selector but cannot be used directly from the UI. Use them via the Braintrust API or SDK instead.
  7. Click Save.
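
Once saved, non-OpenAI deployments can be invoked programmatically. As a minimal sketch using only the Python standard library, the snippet below sends an OpenAI-format chat completion request through the Braintrust AI proxy; my-grok-deployment is a hypothetical name, and must be replaced with the exact deployment name you entered in the Models field.

```python
# Sketch: calling an Azure-deployed model through the Braintrust AI proxy.
# "my-grok-deployment" is a hypothetical deployment name; use the exact
# name from your Azure deployment (the value set in the Models field).
import json
import os
import urllib.request

PROXY_URL = "https://api.braintrust.dev/v1/proxy/chat/completions"

payload = {
    "model": "my-grok-deployment",  # must match the Azure deployment name
    "messages": [{"role": "user", "content": "Say hello in one word."}],
}

request = urllib.request.Request(
    PROXY_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {os.environ.get('BRAINTRUST_API_KEY', '')}",
    },
)

# Only send the request when a real Braintrust API key is configured.
if os.environ.get("BRAINTRUST_API_KEY"):
    with urllib.request.urlopen(request) as resp:
        body = json.load(resp)
        print(body["choices"][0]["message"]["content"])
```

The same request body works for any deployment configured with the OpenAI format; Anthropic-format deployments use the Messages API shape instead.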

Additional resources