Cache

The Cache module provides response caching for agent calls to reduce API costs and improve response times.

Available Providers

Provider      Description
MemoryCache   In-memory cache
FileCache     File-based persistent cache

Quick Start

import { createMemoryCache, createFileCache } from 'praisonai';

// Memory cache
const memCache = createMemoryCache({
  maxSize: 1000,
  ttl: 3600000  // 1 hour
});

// File cache
const fileCache = createFileCache({
  directory: './cache',
  ttl: 86400000  // 24 hours
});
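
Both factories return a cache instance that can be passed to an agent (see Usage with Agents below). As a minimal sketch, a provider could be chosen at startup based on the environment; the factory options mirror the examples above:

import { createMemoryCache, createFileCache } from 'praisonai';

// Sketch: use the persistent file cache in production, the in-memory cache otherwise
const cache = process.env.NODE_ENV === 'production'
  ? createFileCache({ directory: './cache', ttl: 86400000 })  // survives restarts
  : createMemoryCache({ maxSize: 1000, ttl: 3600000 });       // cleared on restart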

Configuration

interface CacheConfig {
  maxSize?: number;  // Maximum number of cached entries
  ttl?: number;      // Time to live in ms
}
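
Both fields are optional, and ttl is given in plain milliseconds, so deriving values from multiplications keeps configurations readable. Note that the directory option passed to createFileCache in the Quick Start is specific to the file provider and not part of this shared interface. A short sketch; the HOUR_MS and DAY_MS constants are illustrative and the factory call mirrors the Quick Start:

import { createMemoryCache } from 'praisonai';

// Illustrative TTL constants in milliseconds
const HOUR_MS = 60 * 60 * 1000;  // 3600000, the memory-cache TTL used above
const DAY_MS = 24 * HOUR_MS;     // 86400000, the file-cache TTL used above

const cache = createMemoryCache({
  maxSize: 1000,
  ttl: HOUR_MS  // entries expire after one hour
});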

Usage with Agents

import { Agent, createMemoryCache } from 'praisonai';

const cache = createMemoryCache({ ttl: 3600000 });

const agent = new Agent({
  name: 'CachedAgent',
  instructions: 'You are helpful.',
  cache
});

// First call - hits the API
const response1 = await agent.chat('Hello');

// Second call - returns the cached response
const response2 = await agent.chat('Hello');
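
Because the cache is supplied through the Agent options, a single instance can be shared by several agents. A minimal sketch under that assumption; the agent names and instructions are illustrative:

import { Agent, createMemoryCache } from 'praisonai';

const sharedCache = createMemoryCache({ ttl: 3600000 });

const researcher = new Agent({
  name: 'Researcher',
  instructions: 'You research topics.',
  cache: sharedCache
});

const writer = new Agent({
  name: 'Writer',
  instructions: 'You write short summaries.',
  cache: sharedCache
});

// Both agents read from and write to the same cache instance
await researcher.chat('List common response caching strategies');
await writer.chat('Write a short intro about response caching');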

CLI Usage

praisonai-ts cache info              # show cache information
praisonai-ts cache providers --json  # list available cache providers as JSON