Multi-Provider Agents

PraisonAI agents can seamlessly switch between LLM providers such as OpenAI, Anthropic, Google, and more. Under the hood, multi-provider support is powered by the AI SDK, but you interact only with the simple Agent abstraction.

Why Multi-Provider?

  • Flexibility: Switch providers without changing your agent code
  • Cost optimization: Use different models for different tasks
  • Redundancy: Fall back to alternative providers if one is unavailable
  • Best-of-breed: Use the best model for each use case
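
The redundancy point above can be sketched as a simple fallback wrapper. This is an illustration only: `ChatFn` and `chatWithFallback` are hypothetical helpers, not part of the praisonai API.

```typescript
// Illustrative sketch: a generic fallback helper. `ChatFn` and
// `chatWithFallback` are hypothetical, not praisonai APIs.
type ChatFn = (prompt: string) => Promise<string>;

// Try each provider-backed chat function in order; return the first
// success, or rethrow the last error if every provider fails.
async function chatWithFallback(fns: ChatFn[], prompt: string): Promise<string> {
  let lastError: unknown;
  for (const fn of fns) {
    try {
      return await fn(prompt);
    } catch (err) {
      lastError = err; // provider unavailable; try the next one
    }
  }
  throw lastError;
}
```

With praisonai agents, each `ChatFn` could be as simple as `(p) => agent.chat(p)` for agents configured with different `llm` strings.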

Installation

npm install praisonai
For additional providers (optional):
npm install @ai-sdk/anthropic @ai-sdk/google @ai-sdk/groq

Quick Start

import { Agent } from 'praisonai';

// Create an agent with any provider
const agent = new Agent({
  instructions: 'You are a helpful assistant.',
  llm: 'openai/gpt-4o-mini'  // or 'anthropic/claude-3-haiku-20240307'
});

// Chat with the agent
const response = await agent.chat('Hello, how are you?');
console.log(response);

Supported Providers

| Provider | Model String | Examples |
|----------|--------------|----------|
| OpenAI | `openai/model` | `openai/gpt-4o`, `openai/gpt-4o-mini` |
| Anthropic | `anthropic/model` | `anthropic/claude-3-5-sonnet-latest` |
| Google | `google/model` | `google/gemini-2.0-flash` |
| Groq | `groq/model` | `groq/llama-3.3-70b-versatile` |
| Mistral | `mistral/model` | `mistral/mistral-large-latest` |
| Cohere | `cohere/model` | `cohere/command-r-plus` |
| DeepSeek | `deepseek/model` | `deepseek/deepseek-chat` |
| xAI | `xai/model` | `xai/grok-2` |
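
All of these model strings share the same `provider/model` shape. A tiny parser makes the convention explicit; note that `parseModelString` and the no-prefix default are assumptions for illustration, not part of the praisonai API.

```typescript
// Hypothetical helper illustrating the `provider/model` string
// convention used above; not part of the praisonai API.
interface ModelRef {
  provider: string; // e.g. 'anthropic'
  model: string;    // e.g. 'claude-3-5-sonnet-latest'
}

function parseModelString(llm: string): ModelRef {
  const slash = llm.indexOf('/');
  if (slash === -1) {
    // No prefix: assume a bare OpenAI model name (assumption, for illustration).
    return { provider: 'openai', model: llm };
  }
  return {
    provider: llm.slice(0, slash),
    model: llm.slice(slash + 1),
  };
}
```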

Agent with Different Providers

import { Agent } from 'praisonai';

// OpenAI Agent
const openaiAgent = new Agent({
  instructions: 'You are a creative writer.',
  llm: 'openai/gpt-4o-mini'
});

// Anthropic Agent
const claudeAgent = new Agent({
  instructions: 'You are a code reviewer.',
  llm: 'anthropic/claude-3-5-sonnet-latest'
});

// Google Agent
const geminiAgent = new Agent({
  instructions: 'You are a research assistant.',
  llm: 'google/gemini-2.0-flash'
});

// Use each agent
const story = await openaiAgent.chat('Write a short story');
const review = await claudeAgent.chat('Review this code: function add(a,b) { return a+b }');
const research = await geminiAgent.chat('Explain quantum computing');

Agent with Streaming

import { Agent } from 'praisonai';

const agent = new Agent({
  instructions: 'You are a poet.',
  llm: 'anthropic/claude-3-haiku-20240307',
  stream: true
});

// Streaming output
const response = await agent.chat('Write a haiku about coding');
// Output streams to console automatically

Agent with Tools

import { Agent } from 'praisonai';

// Define a tool as a simple function
function getWeather(city: string): string {
  return JSON.stringify({ city, temperature: 22, condition: 'sunny' });
}

const agent = new Agent({
  instructions: 'You help users check the weather.',
  llm: 'openai/gpt-4o-mini',
  tools: [getWeather]
});

const response = await agent.chat('What is the weather in Paris?');
console.log(response); // Uses the tool and returns weather info

Agent with Structured Output

import { Agent } from 'praisonai';
import { z } from 'zod';

const PersonSchema = z.object({
  name: z.string(),
  age: z.number(),
  city: z.string()
});

const agent = new Agent({
  instructions: 'Extract person information from text.',
  llm: 'openai/gpt-4o-mini',
  outputSchema: PersonSchema
});

const result = await agent.chat('John is 30 years old and lives in Paris');
// Returns: { name: 'John', age: 30, city: 'Paris' }

Multi-Agent Workflows

import { Agent } from 'praisonai';

// Research agent using Claude
const researcher = new Agent({
  name: 'researcher',
  instructions: 'You research topics thoroughly.',
  llm: 'anthropic/claude-3-5-sonnet-latest'
});

// Writer agent using GPT-4
const writer = new Agent({
  name: 'writer',
  instructions: 'You write engaging content based on research.',
  llm: 'openai/gpt-4o'
});

// Workflow: Research then write
const research = await researcher.chat('Research the history of AI');
const article = await writer.chat(`Write an article based on: ${research}`);

Backend Selection

Agents automatically select the best backend:
| Provider | Backend Used |
|----------|--------------|
| OpenAI | Native OpenAI SDK (fastest) |
| Anthropic | AI SDK |
| Google | AI SDK |
| Other providers | AI SDK |
Override with environment variable:
export PRAISONAI_BACKEND=ai-sdk  # Force AI SDK for all providers
export PRAISONAI_BACKEND=native  # Force native providers only
export PRAISONAI_BACKEND=auto    # Auto-select (default)
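
The selection rule described by the table and the `PRAISONAI_BACKEND` override can be sketched as follows. This is an illustration of the documented behavior, not the library's actual resolver implementation.

```typescript
// Illustrative sketch of the backend selection rule described above;
// not the actual praisonai resolver.
type Backend = 'native' | 'ai-sdk';

function selectBackend(provider: string, override?: string): Backend {
  // PRAISONAI_BACKEND=ai-sdk or =native forces one backend for all providers.
  if (override === 'ai-sdk' || override === 'native') return override;
  // auto (default): OpenAI uses the native SDK; everything else uses the AI SDK.
  return provider === 'openai' ? 'native' : 'ai-sdk';
}
```

In practice the `override` argument would come from `process.env.PRAISONAI_BACKEND`.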

Environment Variables

Set API keys for your providers:
export OPENAI_API_KEY=sk-...
export ANTHROPIC_API_KEY=sk-ant-...
export GOOGLE_API_KEY=AIza...
export GROQ_API_KEY=gsk_...

CLI Usage

Run agents from the command line:
# Chat with different providers
npx praisonai chat "Hello" --model openai/gpt-4o-mini
npx praisonai chat "Hello" --model anthropic/claude-3-haiku-20240307
npx praisonai chat "Hello" --model google/gemini-2.0-flash

# Streaming
npx praisonai chat "Write a poem" --model openai/gpt-4o-mini --stream

# With tools
npx praisonai run agent.ts --model anthropic/claude-3-5-sonnet-latest

Troubleshooting

Provider not found

Ensure you have the provider package installed:
npm install @ai-sdk/anthropic  # For Anthropic
npm install @ai-sdk/google     # For Google

API key errors

Check your environment variables are set correctly:
echo $OPENAI_API_KEY
echo $ANTHROPIC_API_KEY

Model not available

Verify the model name matches the provider’s naming convention.

AI SDK Backend

The AI SDK backend is used internally to provide multi-provider support. You typically don’t need to interact with it directly.
// Internal - not recommended for direct use
import { resolveBackend } from 'praisonai';

const { provider, source } = await resolveBackend('anthropic/claude-3-haiku-20240307');
console.log(source); // 'ai-sdk' or 'legacy'
The backend resolver automatically:
  • Detects installed AI SDK provider packages
  • Falls back to native providers when AI SDK is unavailable
  • Injects attribution headers for multi-agent tracing

Next Steps