

Summary

Issue: Invoking Amazon Nova models from Braintrust fails with an AccessDeniedException even when the IAM policy allows bedrock:InvokeModel and bedrock:InvokeModelWithResponseStream on the foundation model:
AccessDeniedException: User: arn:aws:iam::XXXXXXXXXXXX:user/<user> is not authorized to perform: bedrock:InvokeModelWithResponseStream on resource: arn:aws:bedrock:us-east-1:XXXXXXXXXXXX:inference-profile/us.amazon.nova-2-lite-v1:0 because no identity-based policy allows the bedrock:InvokeModelWithResponseStream action
Cause: Nova models are invoked through cross-region inference profiles (for example, us.amazon.nova-2-lite-v1:0 or global.amazon.nova-pro-v1:0), not directly against the foundation model ARN. The IAM policy attached to your Bedrock credentials must grant invocation permissions on the inference profile resource in addition to the foundation model resource.

Resolution: Update the IAM policy to include both resource types, then configure a Custom Bedrock provider in Braintrust with the exact inference-profile model identifier.

Resolution steps

Step 1: Update the IAM policy to cover both resources

Add both the foundation model and inference profile ARNs to the Resource list, and include bedrock:UseInferenceProfile so the principal can route through the inference profile.
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "BraintrustNovaInvoke",
      "Effect": "Allow",
      "Action": [
        "bedrock:InvokeModel",
        "bedrock:InvokeModelWithResponseStream",
        "bedrock:UseInferenceProfile",
        "bedrock:GetFoundationModel"
      ],
      "Resource": [
        "arn:aws:bedrock:us-east-1::foundation-model/amazon.nova-2-lite-v1:0",
        "arn:aws:bedrock:us-east-1::foundation-model/amazon.nova-pro-v1:0",
        "arn:aws:bedrock:us-east-1:XXXXXXXXXXXX:inference-profile/us.amazon.nova-2-lite-v1:0",
        "arn:aws:bedrock:us-east-1:XXXXXXXXXXXX:inference-profile/us.amazon.nova-pro-v1:0",
        "arn:aws:bedrock:us-east-1:XXXXXXXXXXXX:inference-profile/global.amazon.nova-2-lite-v1:0"
      ]
    }
  ]
}
The policy above is an example. Consult the AWS Bedrock IAM documentation or your AWS administrator for guidance on scoping permissions to your environment.
Replace XXXXXXXXXXXX with your AWS account ID, and add or remove inference profile ARNs to match the regions and Nova variants you plan to use.
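If you manage IAM policies in code, the document above can be assembled programmatically. The following is a minimal sketch, not a definitive implementation: the account ID and model list are placeholders, and it generates only the us.-prefixed inference profile for each model (add eu. or global. entries as needed).

```python
import json

def build_nova_policy(account_id, region, models):
    """Build an IAM policy granting Bedrock invocation on both the
    foundation-model and inference-profile resources for each model."""
    resources = []
    for model in models:  # e.g. "amazon.nova-2-lite-v1:0"
        # Foundation-model ARNs have an empty account field.
        resources.append(f"arn:aws:bedrock:{region}::foundation-model/{model}")
        # Cross-region inference profiles are account-scoped and carry a
        # routing-scope prefix; only "us." is generated in this sketch.
        resources.append(
            f"arn:aws:bedrock:{region}:{account_id}:inference-profile/us.{model}"
        )
    return {
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "BraintrustNovaInvoke",
            "Effect": "Allow",
            "Action": [
                "bedrock:InvokeModel",
                "bedrock:InvokeModelWithResponseStream",
                "bedrock:UseInferenceProfile",
                "bedrock:GetFoundationModel",
            ],
            "Resource": resources,
        }],
    }

policy = build_nova_policy("123456789012", "us-east-1",
                           ["amazon.nova-2-lite-v1:0", "amazon.nova-pro-v1:0"])
print(json.dumps(policy, indent=2))
```

The output can be pasted into the IAM console or passed to an infrastructure-as-code tool.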

Step 2: Match the inference profile prefix to the model identifier

Nova inference profiles are prefixed by routing scope:
  • us.amazon.nova-* — routes within US regions.
  • eu.amazon.nova-* — routes within EU regions.
  • global.amazon.nova-* — routes globally.
The model identifier configured in Braintrust must use the same prefix as the inference profile ARN granted in the policy. If you grant only us.amazon.nova-2-lite-v1:0 but call global.amazon.nova-2-lite-v1:0, the request fails with the same AccessDeniedException.
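One way to catch a prefix mismatch before invoking is a plain string comparison between the model identifier configured in Braintrust and the inference-profile ARNs granted in the policy. A sketch (the helper names are ours; the ARN uses the example account ID from Step 1):

```python
def profile_id_from_arn(arn):
    """Extract the inference-profile identifier, e.g. 'us.amazon.nova-2-lite-v1:0',
    from an inference-profile ARN."""
    return arn.split("inference-profile/", 1)[1]

def is_granted(model_id, granted_arns):
    """True only if the model identifier exactly matches a granted inference
    profile, including the routing-scope prefix (us./eu./global.)."""
    return any(model_id == profile_id_from_arn(arn) for arn in granted_arns)

granted = [
    "arn:aws:bedrock:us-east-1:123456789012:inference-profile/us.amazon.nova-2-lite-v1:0",
]
print(is_granted("us.amazon.nova-2-lite-v1:0", granted))      # True
print(is_granted("global.amazon.nova-2-lite-v1:0", granted))  # False: prefix mismatch
```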

Step 3: Verify with a test invocation

After applying the policy, invoke the Nova model from Braintrust (playground or SDK). A successful response confirms the inference profile permissions are in place. If the request still fails, confirm the IAM principal in the error message matches the one you updated, and that the region in the inference profile ARN matches the region configured on the Braintrust AI provider.
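The fields worth comparing are all present in the error message itself. A hypothetical helper to pull them out, assuming the message follows the format shown in the Summary:

```python
import re

def parse_access_denied(message):
    """Extract the IAM principal, denied action, resource ARN, and region
    from a Bedrock AccessDeniedException message."""
    principal = re.search(r"User: (arn:aws:iam::\d+:\S+)", message)
    action = re.search(r"perform: (bedrock:\S+)", message)
    resource = re.search(r"resource: (arn:aws:bedrock:([a-z0-9-]+):\S+)", message)
    return {
        "principal": principal.group(1) if principal else None,
        "action": action.group(1) if action else None,
        "resource": resource.group(1) if resource else None,
        "region": resource.group(2) if resource else None,
    }

msg = ("User: arn:aws:iam::123456789012:user/braintrust is not authorized to perform: "
       "bedrock:InvokeModelWithResponseStream on resource: "
       "arn:aws:bedrock:us-east-1:123456789012:inference-profile/us.amazon.nova-2-lite-v1:0 "
       "because no identity-based policy allows the "
       "bedrock:InvokeModelWithResponseStream action")
details = parse_access_denied(msg)
print(details["principal"])  # compare against the principal you updated
print(details["region"])     # compare against the Braintrust AI provider region
```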