# AWS Bedrock

Instructions for integrating AWS Bedrock models with PraisonAI, including API setup and agent configuration.

## Add AWS Bedrock to PraisonAI
AWS Bedrock provides access to high-performing foundation models from leading AI companies like Anthropic, Cohere, Meta, Stability AI, and Amazon through a single API.
## Setup

### Prerequisites

Make sure you have AWS credentials configured:
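A typical installation might look like the following — the package names are assumptions based on the PraisonAI and AWS SDK ecosystems:

```shell
# Install PraisonAI and the AWS SDK for Python (package names assumed)
pip install praisonai boto3
```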
### Environment Variables

Set up your AWS credentials:
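For example (placeholder values — substitute your own credentials):

```shell
export AWS_ACCESS_KEY_ID="your-access-key-id"
export AWS_SECRET_ACCESS_KEY="your-secret-access-key"
export AWS_REGION="us-east-1"  # choose a region where Bedrock is available
```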
## Using AWS Bedrock Models

### Available Models

AWS Bedrock supports various model providers:
- **Anthropic Claude**: `bedrock/anthropic.claude-3-5-sonnet-20241022-v2:0`
- **Anthropic Claude Instant**: `bedrock/anthropic.claude-instant-v1`
- **Amazon Titan**: `bedrock/amazon.titan-text-express-v1`
- **Cohere Command**: `bedrock/cohere.command-text-v14`
- **Meta Llama**: `bedrock/meta.llama2-70b-chat-v1`
## agents.yaml Configuration
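A minimal sketch of an `agents.yaml` pointing a role at a Bedrock model — the role, goal, and task fields are illustrative, and the exact schema may differ across PraisonAI versions:

```yaml
framework: praisonai
topic: latest developments in generative AI
roles:
  researcher:
    role: Research Analyst
    goal: Summarize the latest developments in generative AI
    backstory: You are an experienced technology analyst.
    llm:
      model: bedrock/anthropic.claude-3-5-sonnet-20241022-v2:0
    tasks:
      research_task:
        description: Research and summarize recent generative AI news.
        expected_output: A concise summary report.
```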
## Python Code Example
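A minimal sketch, assuming the `praisonaiagents` package is installed and AWS credentials are configured; the instructions and prompt are illustrative:

```python
# Sketch only: requires `pip install praisonaiagents` and AWS credentials
# (AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, AWS_REGION) in the environment.
MODEL_ID = "bedrock/anthropic.claude-3-5-sonnet-20241022-v2:0"

def build_agent():
    from praisonaiagents import Agent  # imported lazily so the sketch stays self-contained
    return Agent(
        instructions="You are a helpful research assistant.",
        llm=MODEL_ID,  # LiteLLM-style model string routed to Bedrock
    )

if __name__ == "__main__":
    agent = build_agent()
    print(agent.start("Summarize the benefits of foundation models."))
```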
## IAM Permissions

Ensure your AWS IAM user/role has the necessary permissions to access Bedrock:
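A policy along these lines grants invoke access; in production, scope `Resource` down to specific model ARNs rather than `*`:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "bedrock:InvokeModel",
        "bedrock:InvokeModelWithResponseStream"
      ],
      "Resource": "*"
    }
  ]
}
```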
## Regional Availability

AWS Bedrock is available in the following regions:

- `us-east-1` (N. Virginia)
- `us-west-2` (Oregon)
- `ap-southeast-1` (Singapore)
- `ap-northeast-1` (Tokyo)
- `eu-central-1` (Frankfurt)
- `eu-west-3` (Paris)

Make sure to set your `AWS_REGION` environment variable to a supported region.
## Cost Optimization

AWS Bedrock charges are based on:

- **Input tokens**: text sent to the model
- **Output tokens**: text generated by the model
Consider using smaller models for development and testing to optimize costs.
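As a rough illustration of the token-based pricing model — the per-1K-token prices below are placeholders, not actual Bedrock rates:

```python
def estimate_cost(input_tokens, output_tokens,
                  input_price_per_1k, output_price_per_1k):
    """Estimate the cost of one invocation; prices are per 1,000 tokens."""
    return ((input_tokens / 1000) * input_price_per_1k
            + (output_tokens / 1000) * output_price_per_1k)

# e.g. a 2,000-token prompt and a 500-token reply at placeholder rates
print(estimate_cost(2000, 500, 0.003, 0.015))
```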
## Error Handling

Common errors and solutions:

- `AccessDeniedException`: check your IAM permissions
- `ResourceNotFoundException`: verify the model ID is correct and available in your region
- `ThrottlingException`: implement retry logic with exponential backoff
- `ValidationException`: check your input parameters and format
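For `ThrottlingException` in particular, a generic backoff helper can wrap the model call — a sketch not tied to any specific SDK (with boto3 you would catch `botocore.exceptions.ClientError` and inspect its error code):

```python
import random
import time

def with_backoff(call, max_retries=5, base_delay=1.0):
    """Retry `call` on throttling errors with exponential backoff plus jitter."""
    for attempt in range(max_retries):
        try:
            return call()
        except Exception as exc:
            # Sketch: match on the name; with boto3 you would check
            # err.response["Error"]["Code"] == "ThrottlingException".
            if "Throttling" not in type(exc).__name__ and "Throttling" not in str(exc):
                raise
            time.sleep(base_delay * (2 ** attempt) + random.uniform(0, base_delay))
    return call()  # final attempt; let any remaining error propagate
```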
## Advanced Configuration

### Custom Endpoint

For specific regions or custom endpoints:
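One approach is to build the underlying boto3 client yourself; `bedrock_client_kwargs` is a hypothetical helper, while `service_name`, `region_name`, and `endpoint_url` are real boto3 client parameters:

```python
def bedrock_client_kwargs(region="us-east-1", endpoint_url=None):
    """Assemble boto3 client arguments; endpoint_url overrides the default regional endpoint."""
    kwargs = {"service_name": "bedrock-runtime", "region_name": region}
    if endpoint_url:
        kwargs["endpoint_url"] = endpoint_url
    return kwargs

# Usage (requires `pip install boto3` and AWS credentials):
# import boto3
# client = boto3.client(**bedrock_client_kwargs(
#     region="eu-central-1",
#     endpoint_url="https://bedrock-runtime.eu-central-1.amazonaws.com",
# ))
```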
### Streaming Responses

For real-time responses:
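A sketch using LiteLLM (the routing layer PraisonAI uses under the hood); it requires `pip install litellm` plus AWS credentials, and prints tokens as chunks arrive:

```python
def stream_response(prompt, model="bedrock/anthropic.claude-3-5-sonnet-20241022-v2:0"):
    """Print a Bedrock completion token by token as chunks arrive."""
    from litellm import completion  # lazy import: requires `pip install litellm`
    response = completion(
        model=model,
        messages=[{"role": "user", "content": prompt}],
        stream=True,  # yields incremental chunks instead of one final message
    )
    for chunk in response:
        delta = chunk.choices[0].delta.content
        if delta:
            print(delta, end="", flush=True)
```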
| PraisonAI Chat | PraisonAI Code | PraisonAI (Multi-Agents) |
|---|---|---|
| LiteLLM | LiteLLM | Models |