Context Management
PraisonAI provides a complete context management system that prevents context overflow, optimizes token usage, and gives real-time visibility into what is being sent to the model.
Overview
Core Components
| Component | Purpose |
|---|---|
| Token Estimation | Fast offline token counting |
| Context Ledger | Token accounting per segment |
| Context Budgeter | Model limits and budget allocation |
| Context Composer | Message assembly with limits |
| Context Optimizer | Compaction strategies |
| Context Monitor | Real-time disk snapshots |
Agent-Centric Quick Start
The simplest way to enable context management is with the context= parameter:
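A minimal sketch of the quick start (this assumes the `praisonaiagents` package is installed and an API key is configured; `context=True` enables the defaults listed in the CLI flags table below):

```python
from praisonaiagents import Agent

# Enable context management with its defaults
# (auto-compaction on, "smart" strategy, 0.8 trigger threshold).
agent = Agent(
    instructions="You are a helpful assistant.",
    context=True,  # turn on the context management system
)

agent.start("Summarize the key points of our conversation so far.")
```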
Custom Configuration
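One plausible shape for a custom configuration is a dict passed to `context=`. The keys below are assumptions that mirror the CLI flags documented further down; check the package reference for the exact accepted form:

```python
from praisonaiagents import Agent

# Hypothetical configuration keys -- assumed here to mirror the CLI
# flags (--context-strategy, --context-threshold, ...); the library
# may spell these differently.
agent = Agent(
    instructions="You are a helpful assistant.",
    context={
        "auto_compact": True,    # cf. --context-auto-compact (default: true)
        "strategy": "smart",     # cf. --context-strategy (default: smart)
        "threshold": 0.8,        # cf. --context-threshold (default: 0.8)
        "output_reserve": 8000,  # cf. --context-output-reserve (default: 8000)
    },
)
```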
Low-Level API (Advanced)
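The interplay of the low-level components can be pictured with a self-contained sketch. This is an illustrative stand-in, not the library's actual classes: a heuristic estimator (roughly 4 characters per token), a ledger that accounts tokens per segment, and a budgeter that subtracts an output reserve from the model limit and triggers compaction at a threshold.

```python
# Illustrative stand-in for the low-level pipeline; the real
# PraisonAI class names and signatures may differ.

def estimate_tokens(text: str) -> int:
    """Heuristic fallback: roughly 4 characters per token."""
    return max(1, len(text) // 4)

class Ledger:
    """Token accounting per segment (system, history, tools, ...)."""
    def __init__(self):
        self.segments = {}

    def record(self, name: str, text: str) -> None:
        self.segments[name] = self.segments.get(name, 0) + estimate_tokens(text)

    def total(self) -> int:
        return sum(self.segments.values())

class Budgeter:
    """Model limit minus an output reserve, plus a compaction trigger."""
    def __init__(self, model_limit=128_000, output_reserve=8_000, threshold=0.8):
        self.input_budget = model_limit - output_reserve
        self.threshold = threshold

    def should_compact(self, ledger: Ledger) -> bool:
        return ledger.total() >= self.threshold * self.input_budget

ledger = Ledger()
ledger.record("system", "You are a helpful assistant." * 100)
ledger.record("history", "user: hi\nassistant: hello" * 20_000)

budgeter = Budgeter()
print(ledger.total(), budgeter.should_compact(ledger))  # 125700 True
```

When the budgeter fires, the optimizer would compact the heaviest segments (typically history) using one of the strategies listed under Features.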
CLI Interactive Mode
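A sketch of launching an interactive session with the flags documented below (this assumes the `praisonai` CLI entry point is on your PATH; the exact subcommand may vary by install):

```shell
# Start an interactive session with monitoring and auto-compaction,
# using the documented flags and their defaults made explicit.
praisonai --context-auto-compact \
          --context-strategy smart \
          --context-threshold 0.8 \
          --context-monitor \
          --context-monitor-path ./context.txt
```

Once inside the session, the /context commands listed under Interactive Commands become available.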
Features
Token Estimation
Fast offline token counting with heuristic fallback
Context Ledger
Per-segment token accounting
Context Budgeter
Model limits and budget allocation
Context Optimizer
6 optimization strategies
Context Monitor
Real-time context snapshots
CLI Commands
/context commands and flags
CLI Flags
| Flag | Description | Default |
|---|---|---|
| --context-auto-compact | Enable automatic compaction | true |
| --context-strategy | Optimization strategy | smart |
| --context-threshold | Trigger threshold (0.0-1.0) | 0.8 |
| --context-monitor | Enable monitoring | false |
| --context-monitor-path | Output file path | ./context.txt |
| --context-monitor-format | Output format | human |
| --context-output-reserve | Reserve for output | 8000 |
Environment Variables
Interactive Commands
| Command | Description |
|---|---|
/context | Show context stats |
/context show | Summary + budgets |
/context stats | Token ledger table |
/context budget | Budget allocation |
/context dump | Write snapshot now |
/context on | Enable monitoring |
/context off | Disable monitoring |
/context compact | Trigger optimization |
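A typical monitoring workflow inside an interactive session might look like this (an illustrative transcript built from the commands above):

```
/context on       # start writing snapshots (default path: ./context.txt)
/context stats    # inspect the per-segment token ledger
/context compact  # force an optimization pass now
/context off      # stop monitoring
```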
Multi-Agent Support
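A sketch of enabling context management per agent in a multi-agent workflow, assuming each Agent accepts the same context= switch as in the quick start and that each agent then tracks its own ledger and budget:

```python
from praisonaiagents import Agent, PraisonAIAgents

# context=True on each agent; per-agent budgets and ledgers are
# assumed here based on the single-agent behavior documented above.
researcher = Agent(instructions="Research the topic.", context=True)
writer = Agent(instructions="Write a summary of the findings.", context=True)

workflow = PraisonAIAgents(agents=[researcher, writer])
workflow.start()
```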
Next Steps
- Token Estimation - Learn about fast token counting
- Context Optimizer - Explore optimization strategies
- Context Monitor - Set up real-time monitoring

