External models are available on enterprise plans. Org admins, IT admins, and users with AI Studio full access roles can add and manage models.
## How external models work
The Models page in AI Studio provides a unified governance layer for managing AI models across your organization. You can add models from external providers and control which teams have access to use them when building agents.
- Credential configuration: Administrators add provider credentials (API keys or IAM roles) to AI Studio
- Model selection: Administrators choose which models from the provider to enable
- Access control: Administrators assign model access to all teams or specific teams
- Agent building: Developers with access can select the model when building agents
- Request routing: When agents run, requests route through the configured provider credentials
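The access-control step above determines which external models a builder can select. As a minimal illustration of that filtering logic (the class and field names here are invented for the example; they are not AI Studio's internal data model):

```python
from dataclasses import dataclass, field


@dataclass
class ExternalModel:
    """Illustrative stand-in for a configured external model."""
    model_id: str
    provider: str
    all_teams: bool = False                        # access granted to all teams
    teams: set = field(default_factory=set)        # or to specific teams only


def models_available_to(team: str, models: list) -> list:
    """Return the model IDs the given team can use when building agents."""
    return [m.model_id for m in models if m.all_teams or team in m.teams]


models = [
    ExternalModel("example-sonnet-model", "bedrock", all_teams=True),
    ExternalModel("example-llama-model", "bedrock", teams={"research"}),
]
print(models_available_to("support", models))   # only the all-teams model
print(models_available_to("research", models))  # both models
```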
## Available providers
AI Studio supports external models from the following providers. Select a provider to view detailed configuration instructions.

| Provider | Status | Auth type | Provider docs |
|---|---|---|---|
| AWS Bedrock | Available | Access keys or IAM Role ARN | View docs |
| HuggingFace | Coming soon | API key | — |
| Nvidia | Coming soon | API key | — |
AWS Bedrock models are currently supported in the following regions: us-east-1, us-west-1, us-west-2, and eu-west-1. Additional regions may be added in future releases.

## Add an external model
Add external models in AI Studio under Models & Guardrails > Models.

### Configure provider credentials
Before adding a model, you need credentials from your provider. The credential type depends on the provider.

Access key credentials (AWS Bedrock):
- Create an IAM user with permissions to invoke Bedrock models
- Generate access keys for the IAM user
- See Configure AWS Bedrock credentials for detailed steps
IAM Role ARN credentials (AWS Bedrock):
- Create an IAM role with Bedrock permissions
- Configure a trust policy allowing Writer to assume the role
- Provides more granular access control without sharing long-term credentials
- See Configure AWS Bedrock credentials for detailed steps
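For the IAM-role option, the role needs a permissions policy that allows Bedrock invocation and a trust policy that lets Writer assume it. The following is an illustrative sketch: the exact principal account and any external-ID condition come from the Configure AWS Bedrock credentials guide, and `<WRITER_ACCOUNT_ID>` is a placeholder.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "bedrock:InvokeModel",
        "bedrock:InvokeModelWithResponseStream"
      ],
      "Resource": "*"
    }
  ]
}
```

And a trust policy along these lines, with the placeholder replaced by the account ID shown in AI Studio:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::<WRITER_ACCOUNT_ID>:root" },
      "Action": "sts:AssumeRole"
    }
  ]
}
```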
### Add a model in AI Studio
After obtaining your provider credentials:

- Navigate to Models & Guardrails > Models in AI Studio
- Select + Add model
- Choose your provider (for example, AWS Bedrock)
- Enter your credentials:
- For access keys: Enter your access key ID and secret access key
- For Role ARN: Enter the IAM role ARN
- Select the AWS region where your models are deployed (supported regions: us-east-1, us-west-1, us-west-2, and eu-west-1)
- Choose which models to enable from the available list
- Configure team access (all teams or specific teams)
- Select Add model to complete the setup
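Before entering credentials in AI Studio, you can sanity-check them outside the product. A sketch using the AWS SDK for Python (assumes boto3 is installed and the credentials are exported in your environment; the supported-regions set mirrors the list above):

```python
SUPPORTED_REGIONS = {"us-east-1", "us-west-1", "us-west-2", "eu-west-1"}


def is_supported_region(region: str) -> bool:
    """Check a region against the Bedrock regions AI Studio currently supports."""
    return region in SUPPORTED_REGIONS


def check_bedrock_access(region: str) -> None:
    """List Bedrock foundation models to confirm the credentials work."""
    import boto3  # imported here so is_supported_region works without the AWS SDK

    client = boto3.client("bedrock", region_name=region)
    for model in client.list_foundation_models()["modelSummaries"][:5]:
        print(model["modelId"])
```

For example, `check_bedrock_access("us-east-1")` should print model IDs if the credentials and region are valid; an authorization error here will also surface as an Unhealthy status in AI Studio.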

## Manage team access
Control which teams can use external models when building agents.

### Configure model availability
When adding a model, you can set access to:

- All teams: The model is immediately available to everyone with builder access
- Specific teams: Restrict the model to selected teams
### Update team access
To view or update which teams can access a model:

- Navigate to Models & Guardrails > Models in AI Studio
- View the current team access in the Team Access column
- Select the menu icon and choose Edit to update team access
Credentials are configured at the model level, not the team level. Team access controls who can use the model, but all authorized users share the same underlying provider credentials. Team members with access can use the models but cannot view the credentials.
## Monitor model health
AI Studio automatically monitors the health of your configured models every 5 minutes and displays their status in the Models list.

| Status | Description |
|---|---|
| Healthy | The model is responding correctly and credentials are valid |
| Unhealthy | The model is not responding. Check credentials and provider status |
### View model health details
Select any model in the Models list to view detailed health information:

- Health status: Current health state (Healthy or Unhealthy)
- Last checked: Timestamp of the most recent health check
- Model details: Provider, model name, and model ID
- Team access: Which teams can use this model
### Automatic recovery
When a model becomes unhealthy, AI Studio temporarily removes it from the available models pool. After a cooldown period, the system automatically retries the model. If the underlying issue is resolved, the model returns to a healthy status without any manual intervention. You don’t need to delete or reconfigure a model that shows an unhealthy status due to transient issues like temporary provider outages; the system handles recovery automatically.

### Troubleshoot persistent issues
If a model remains unhealthy, check the following:

- Credential validity: Credentials may have expired or been revoked. Update credentials if needed.
- Provider status: The provider service may be experiencing extended downtime. Check the provider’s status page.
- Regional availability: The model may not be available in the configured region.
- Permission changes: IAM policies may have been modified. Verify your IAM user or role still has the required permissions.
- Model deprecation: The provider may have deprecated the model. Check the provider’s documentation for model availability.
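The automatic recovery behavior described above amounts to a cooldown-gated retry. As a simplified illustration only (AI Studio’s actual intervals and internals aren’t public; the class and method names here are invented):

```python
import time


class ModelHealth:
    """Toy model of the unhealthy -> cooldown -> retry cycle."""

    def __init__(self, cooldown_seconds: float = 300.0, clock=time.monotonic):
        self.cooldown_seconds = cooldown_seconds
        self.clock = clock
        self.healthy = True
        self.unhealthy_since = None

    def mark_unhealthy(self) -> None:
        """A failed health check pulls the model from the available pool."""
        self.healthy = False
        self.unhealthy_since = self.clock()

    def mark_healthy(self) -> None:
        """A passing retry returns the model to the pool, no manual action needed."""
        self.healthy = True
        self.unhealthy_since = None

    def available(self) -> bool:
        """Only healthy models are routable."""
        return self.healthy

    def should_retry(self) -> bool:
        """Once the cooldown elapses, the health check runs again."""
        return (not self.healthy
                and self.clock() - self.unhealthy_since >= self.cooldown_seconds)


now = [0.0]  # fake clock so the example runs instantly
model = ModelHealth(cooldown_seconds=300, clock=lambda: now[0])
model.mark_unhealthy()        # health check failed: pulled from the pool
now[0] = 300.0                # cooldown elapsed
print(model.should_retry())   # True: the system retries the health check
```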
## Manage credentials
AI Studio stores provider credentials as named credential sets that you can reuse across multiple models. You can manage credentials from the dedicated LLM Credentials page or create them when adding a model.
### Create credentials from the LLM Credentials page
To create credentials before adding models:

- Navigate to Models & Guardrails > LLM Credentials in AI Studio
- Select Add credentials
- Enter a Credential name (for example, `production-bedrock` or `dev-aws-credentials`)
- Select the Provider (for example, Bedrock)
- Enter the provider-specific authentication details:
- For access keys: AWS Access Key ID and Secret Access Key
- For role assumption: AWS Role ARN and optional session name
- AWS Region Name
- Select Save

### Enter credentials directly when adding a model
When adding a model, you can enter credentials directly in the Add Model form instead of selecting existing credentials. Credentials entered this way are stored with the model but are not saved as a named credential set for reuse with other models. To create reusable credentials that you can share across multiple models, use the LLM Credentials page instead.

### Reuse credentials across models
When adding additional models from the same provider:

- Select your existing credentials from the Credentials name dropdown
- The stored authentication details are automatically applied
- You don’t need to re-enter access keys or other sensitive values
### Update credentials
How you update expired or rotated credentials depends on how you originally configured them.

Named credentials (created on the LLM Credentials page):
- Navigate to Models & Guardrails > LLM Credentials in AI Studio
- Locate the credential in the list
- Select the menu icon and choose Edit credentials
- Update the authentication details and save
Model-level credentials (entered directly when adding a model):
- Navigate to Models & Guardrails > Models in AI Studio
- Locate the model in the list
- Select the menu icon and choose Edit
- Update the credential values and save
### Delete credentials
To delete a named credential:

- Navigate to Models & Guardrails > LLM Credentials in AI Studio
- Locate the credential in the list
- Select the menu icon and choose Delete
## Security best practices
Follow these practices when managing provider credentials:

- Use dedicated service accounts rather than personal credentials
- Rotate credentials on a regular schedule (for example, every 90 days)
- Use descriptive credential names that indicate environment or purpose (for example, `prod-bedrock-us-east`)
- For AWS, prefer IAM Role ARN over access keys when possible
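The 90-day rotation guideline above is easy to track with a simple date check. A toy helper for illustration (not an AI Studio feature; rotation itself still happens on the provider side and via Update credentials):

```python
from datetime import date, timedelta


def rotation_due(created: date, today: date, max_age_days: int = 90) -> bool:
    """True once a credential is older than the rotation window."""
    return today - created > timedelta(days=max_age_days)


print(rotation_due(date(2024, 1, 1), date(2024, 2, 1)))  # False: 31 days old
print(rotation_due(date(2024, 1, 1), date(2024, 5, 1)))  # True: 121 days old
```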
## Edit a model
To update an external model’s configuration, including credentials and team access:

- Navigate to Models & Guardrails > Models in AI Studio
- Locate the model in the list
- Select the menu icon and choose Edit
- Update the configuration and save
## Delete a model
To remove an external model from AI Studio:

- Navigate to Models & Guardrails > Models in AI Studio
- Locate the model in the list
- Select the menu icon and choose Delete
## Use external models
Once you add an external model, it’s available to use just like Palmyra models.

### Agent Builder and no-code apps
In Agent Builder and no-code chat apps, external models appear in the Model dropdown alongside Palmyra models. Select any model your team has access to.

### API usage
External models use the same Writer API as Palmyra models. Use the List models endpoint to see all available models and their IDs, then pass the model ID to any endpoint that supports the `model` parameter.
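The flow above can be sketched with plain HTTP. Treat the details here as assumptions to verify against the Writer API reference: the base URL, the `/models` path, and the chat payload shape follow the API’s documented pattern, and `my-bedrock-model-id` is an illustrative placeholder for an ID returned by List models.

```python
import json
import urllib.request

API_BASE = "https://api.writer.com/v1"  # confirm against the current API reference


def chat_payload(model_id: str, prompt: str) -> dict:
    """Build a chat request body; external models use the same shape as Palmyra models."""
    return {
        "model": model_id,  # an external model ID returned by List models
        "messages": [{"role": "user", "content": prompt}],
    }


def list_models(api_key: str) -> bytes:
    """Call the List models endpoint to discover available model IDs."""
    req = urllib.request.Request(
        f"{API_BASE}/models",
        headers={"Authorization": f"Bearer {api_key}"},
    )
    with urllib.request.urlopen(req) as resp:  # network call; requires a valid key
        return resp.read()


print(json.dumps(chat_payload("my-bedrock-model-id", "Summarize this ticket.")))
```

Once List models returns your external model’s ID, pass it in the `model` field exactly as you would a Palmyra model ID.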
## Next steps
- Configure AWS Bedrock: Set up AWS credentials and add Bedrock models
- Choose a model: Compare Palmyra models with external provider models
- Palmyra models: Explore Writer’s Palmyra model capabilities