Enable response and prompt caching to improve performance and reduce API costs.

Quick Start

1. Simple Enable

Enable caching with defaults:
from praisonaiagents import Agent

agent = Agent(
    name="Cached Agent",
    instructions="Use caching",
    caching=True
)
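
With defaults enabled, repeated identical prompts can be served from the response cache instead of triggering a new model call. A minimal usage sketch, assuming the agent exposes a start() method (as in other PraisonAI examples) and that cached responses are keyed on the prompt:

from praisonaiagents import Agent

agent = Agent(
    name="Cached Agent",
    instructions="Use caching",
    caching=True
)

# The first call goes to the model; an identical second call
# can be answered from the response cache (assumption: the cache keys on the prompt)
first = agent.start("Summarize the benefits of response caching")
second = agent.start("Summarize the benefits of response caching")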

2. With Configuration

Configure caching behavior:
from praisonaiagents import Agent
from praisonaiagents.config import CachingConfig

agent = Agent(
    name="Cached Agent",
    instructions="Use caching",
    caching=CachingConfig(
        enabled=True,
        prompt_caching=True
    )
)

Configuration Options

from praisonaiagents.config import CachingConfig

config = CachingConfig(
    # Response caching
    enabled=True,
    
    # Prompt caching (provider-specific)
    prompt_caching=None
)

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| enabled | bool | True | Enable response caching |
| prompt_caching | bool \| None | None | Enable prompt caching (Anthropic, etc.) |
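
A CachingConfig can be built once and reused wherever agents should follow the same caching policy. A minimal sketch, assuming a single config instance can safely be passed to multiple agents:

from praisonaiagents import Agent
from praisonaiagents.config import CachingConfig

# One config object, reused for a consistent caching policy across agents
cache_config = CachingConfig(enabled=True, prompt_caching=True)

researcher = Agent(
    name="Researcher",
    instructions="Research topics",
    caching=cache_config
)

writer = Agent(
    name="Writer",
    instructions="Write summaries",
    caching=cache_config
)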

Common Patterns

Pattern 1: Full Caching

from praisonaiagents import Agent
from praisonaiagents.config import CachingConfig

agent = Agent(
    name="Full Cache Agent",
    instructions="Maximum caching",
    caching=CachingConfig(
        enabled=True,
        prompt_caching=True
    )
)

Pattern 2: Disable Caching

from praisonaiagents import Agent
from praisonaiagents.config import CachingConfig

agent = Agent(
    name="No Cache Agent",
    instructions="Always fresh responses",
    caching=CachingConfig(enabled=False)
)

Best Practices

Anthropic Claude supports prompt caching, which can significantly cut costs when a long, stable prompt prefix is reused across calls (see the sketch below).
Turn off caching when agents need fresh, real-time information.
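
A minimal sketch of pairing prompt caching with an Anthropic model. The llm parameter and the model identifier below are assumptions borrowed from other PraisonAI examples, not something this page specifies:

from praisonaiagents import Agent
from praisonaiagents.config import CachingConfig

agent = Agent(
    name="Claude Agent",
    # A long, stable instruction block is the part that benefits most from prompt caching
    instructions="You are a support assistant. Follow the team's style guide and escalation policy.",
    llm="anthropic/claude-3-5-sonnet-20241022",  # assumption: llm selects the provider/model
    caching=CachingConfig(
        enabled=True,        # response caching
        prompt_caching=True  # provider-specific prompt caching (Anthropic)
    )
)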