## Overview
PraisonAI uses LiteLLM under the hood, supporting 100+ LLM providers. Use the format `provider/model-name` for any supported model.
## LiteLLM Provider Format
| Provider | Format | Example |
|---|---|---|
| OpenAI | gpt-* or openai/* | gpt-4o, openai/gpt-4o |
| Anthropic | claude-* | claude-sonnet-4-5 |
| Google | gemini/* | gemini/gemini-2.5-flash |
| Azure | azure/* | azure/gpt-4 |
| AWS Bedrock | bedrock/* | bedrock/anthropic.claude-3-5-sonnet |
| Vertex AI | vertex_ai/* | vertex_ai/gemini-pro |
| Hugging Face | huggingface/* | huggingface/meta-llama/Llama-2-7b |
| Together AI | together_ai/* | together_ai/togethercomputer/llama-2-70b |
| Replicate | replicate/* | replicate/meta/llama-2-70b |
| Anyscale | anyscale/* | anyscale/meta-llama/Llama-2-70b |
## Python (Generic Pattern)
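A minimal sketch, assuming the `praisonaiagents` package and its `Agent(llm=...)` parameter as shown in the PraisonAI README, plus the matching provider key (here `GEMINI_API_KEY`) in the environment; the instructions and prompt are placeholders:

```python
# Minimal sketch, assuming the praisonaiagents package and a GEMINI_API_KEY
# set in the environment. Any provider/model string from the table above works.
from praisonaiagents import Agent

agent = Agent(
    instructions="You are a helpful assistant.",
    llm="gemini/gemini-2.5-flash",  # swap for any LiteLLM model string
)
agent.start("Why is the sky blue?")
```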
## OpenAI-Compatible Endpoints
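Any OpenAI-compatible server can be reached by pointing the OpenAI client variables at it and using the `openai/` prefix. A sketch, where the URL and model name are placeholders:

```python
import os

# Point LiteLLM's OpenAI route at a self-hosted, OpenAI-compatible server.
# The URL and model name below are placeholders.
os.environ["OPENAI_API_BASE"] = "http://localhost:8080/v1"
os.environ["OPENAI_API_KEY"] = "sk-anything"  # many local servers ignore the key

llm = "openai/my-served-model"  # pass this string wherever a model is expected
print(llm)
```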
### LM Studio (Local)
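LM Studio's built-in server speaks the OpenAI API on port 1234 by default, so the same `openai/` route applies; a sketch:

```shell
# LM Studio's local server is OpenAI-compatible (default port 1234).
export OPENAI_API_BASE="http://localhost:1234/v1"
export OPENAI_API_KEY="lm-studio"   # placeholder; the local server ignores it
# then use e.g. llm="openai/<model-name-shown-in-LM-Studio>"
```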
### vLLM Server
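vLLM exposes an OpenAI-compatible server (port 8000 by default), so it is configured the same way; the model name is a placeholder:

```shell
# Start vLLM's OpenAI-compatible server first, e.g.:
#   vllm serve <model-name>
export OPENAI_API_BASE="http://localhost:8000/v1"
export OPENAI_API_KEY="dummy"   # vLLM does not require a real key by default
# then use llm="openai/<served-model-name>"
```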
## CLI
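A hedged sketch of command-line usage: the `praisonai agents.yaml` invocation follows the PraisonAI README, while the model-selection variable name is an assumption to verify against your installed version:

```shell
# Provider key plus model selection via environment.
export GEMINI_API_KEY="..."                    # key for the chosen provider
export MODEL_NAME="gemini/gemini-2.5-flash"    # assumed variable name; check the docs
praisonai agents.yaml                          # run the agents defined in agents.yaml
```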
## YAML
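A sketch of an `agents.yaml` with the model set per role; the field names mirror examples from the PraisonAI repository and should be treated as assumptions against your installed version:

```yaml
framework: praisonai
topic: space exploration
roles:
  researcher:
    role: Researcher
    goal: Research {topic}
    backstory: A meticulous analyst.
    llm:
      model: gemini/gemini-2.5-flash   # any LiteLLM model string
    tasks:
      research_task:
        description: Gather recent findings on {topic}.
        expected_output: A short summary with sources.
```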
### Custom Endpoint YAML
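For a self-hosted endpoint, the base URL and key can sit alongside the model; the `base_url`/`api_key` field names are assumptions to verify, and all values are placeholders:

```yaml
roles:
  assistant:
    role: Assistant
    goal: Answer questions
    backstory: A concise helper.
    llm:
      model: openai/my-served-model       # placeholder model name
      base_url: http://localhost:8080/v1  # your OpenAI-compatible endpoint
      api_key: sk-anything                # many local servers ignore the key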
## Environment Variables
Common environment variables for different providers:
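A sketch of the keys LiteLLM conventionally reads per provider (all values are placeholders):

```shell
export OPENAI_API_KEY="sk-..."          # OpenAI
export ANTHROPIC_API_KEY="sk-ant-..."   # Anthropic
export GEMINI_API_KEY="..."             # Google AI Studio
export AZURE_API_KEY="..."              # Azure OpenAI
export AZURE_API_BASE="https://<resource>.openai.azure.com"
export AZURE_API_VERSION="2024-02-01"
export AWS_ACCESS_KEY_ID="..."          # AWS Bedrock
export AWS_SECRET_ACCESS_KEY="..."
export AWS_REGION_NAME="us-east-1"
export HUGGINGFACE_API_KEY="hf_..."     # Hugging Face
export TOGETHERAI_API_KEY="..."         # Together AI
export REPLICATE_API_KEY="..."          # Replicate
```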
## Resources

- LiteLLM Providers - Full list of supported providers
- LiteLLM Models - Model discovery and pricing

