Streaming
The PraisonAI TypeScript SDK supports streaming responses, allowing you to receive agent output in real time as it is generated.
Quickstart
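The snippet below is a minimal sketch of a streaming agent. It assumes the `Agent` class and its `start()` method from the Agent documentation, and that `start()` resolves with the complete response text once generation finishes.

```typescript
import { Agent } from 'praisonai';

async function main() {
  // Streaming is enabled by default, so tokens appear in the console as they are generated.
  const agent = new Agent({
    instructions: 'You are a helpful assistant.',
  });

  // The resolved value is assumed to be the complete response text.
  const response = await agent.start('Write a short story about a robot learning to paint.');
  console.log('\nDone. Response length:', response.length);
}

main();
```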
Configuration
Enable Streaming
Streaming is enabled by default:
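A sketch of enabling streaming explicitly; the `stream` flag is assumed to be a constructor option, as described in this section.

```typescript
import { Agent } from 'praisonai';

// Explicitly enable streaming (the assumed default behavior).
const agent = new Agent({
  instructions: 'You are a helpful assistant.',
  stream: true,
});

agent.start('Explain how streaming works in one paragraph.');
```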
Disable Streaming
For batch processing or when you need the complete response:
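A sketch with streaming turned off, assuming `stream: false` makes `start()` resolve only once the full response is available.

```typescript
import { Agent } from 'praisonai';

async function main() {
  // Disable streaming: wait for the complete response instead of token-by-token output.
  const agent = new Agent({
    instructions: 'You are a helpful assistant.',
    stream: false,
  });

  const response = await agent.start('Summarize the plot of Hamlet in three sentences.');
  console.log(response); // Complete response, suitable for JSON APIs or batch jobs.
}

main();
```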
Streaming Behavior
With Verbose Mode
When both `stream` and `verbose` are enabled:
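A sketch combining the two flags; both are assumed to be constructor options.

```typescript
import { Agent } from 'praisonai';

// With stream and verbose both on, tokens are streamed to the console
// along with additional progress information.
const agent = new Agent({
  instructions: 'You are a helpful assistant.',
  stream: true,
  verbose: true,
});

agent.start('List three uses of TypeScript generics.');
```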
Silent Streaming
Stream without console output:
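A sketch of silent streaming, assuming `verbose: false` suppresses console output while the response is still generated incrementally and returned by `start()`.

```typescript
import { Agent } from 'praisonai';

async function main() {
  // Stream internally, but keep the console quiet.
  const agent = new Agent({
    instructions: 'You are a helpful assistant.',
    stream: true,
    verbose: false,
  });

  const response = await agent.start('Draft a welcome email for new users.');
  // Use the complete response yourself instead of relying on console output.
  console.log(response);
}

main();
```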
Use Cases
Interactive Chat
Long-Form Content
Code Generation
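These use cases all follow the same pattern as the Quickstart above. A sketch for code generation, under the same assumed `Agent` API:

```typescript
import { Agent } from 'praisonai';

// Streaming lets the user watch the code appear as it is written,
// which is useful for long generations.
const coder = new Agent({
  instructions: 'You are a senior TypeScript developer. Return well-commented code.',
  stream: true,
});

coder.start('Write a debounce utility function in TypeScript.');
```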
Streaming with Tools
Tool calls are handled automatically during streaming:
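A sketch of a tool-calling agent with streaming enabled. The registration format shown here (a plain async function passed via a `tools` array) is an assumption; see the Tools documentation for the exact shape the SDK expects.

```typescript
import { Agent } from 'praisonai';

// A hypothetical tool; real tool registration may require a schema
// describing the parameters (see the Tools documentation).
async function getCurrentTime(): Promise<string> {
  return new Date().toISOString();
}

const agent = new Agent({
  instructions: 'You are a helpful assistant. Use tools when needed.',
  stream: true,
  tools: [getCurrentTime], // assumed registration format
});

// Tool calls are resolved automatically during generation;
// streaming continues with the final answer.
agent.start('What time is it right now?');
```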
Performance Considerations
When to Use Streaming
- Interactive applications: Real-time user feedback
- Long responses: Show progress during generation
- Chat interfaces: Natural conversation flow
When to Disable Streaming
- Batch processing: Processing many requests
- API endpoints: Need complete response for JSON
- Testing: Easier to validate complete responses
Environment Variables
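The exact variables depend on your provider configuration (see Providers); at minimum, an API key for the underlying LLM is typically required. A hedged sketch of checking for one before running the examples:

```typescript
// OPENAI_API_KEY is assumed here for the default OpenAI provider;
// other providers use their own variables (see the Providers docs).
if (!process.env.OPENAI_API_KEY) {
  throw new Error('Set OPENAI_API_KEY before running the streaming examples.');
}
```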
Example: Streaming Chat Application
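A sketch of a terminal chat loop built on the same assumed `Agent` API, using Node's readline. How the SDK persists conversation history between turns is not shown here.

```typescript
import { Agent } from 'praisonai';
import * as readline from 'node:readline/promises';
import { stdin as input, stdout as output } from 'node:process';

async function main() {
  const agent = new Agent({
    instructions: 'You are a friendly chat assistant. Keep answers concise.',
    stream: true, // tokens print as they arrive, giving a natural chat feel
  });

  const rl = readline.createInterface({ input, output });
  console.log('Type a message, or "exit" to quit.');

  while (true) {
    const userInput = await rl.question('\nYou: ');
    if (userInput.trim().toLowerCase() === 'exit') break;

    // Each turn is sent as a new prompt; carrying history across turns
    // depends on the SDK's memory features and is not shown here.
    await agent.start(userInput);
  }

  rl.close();
}

main();
```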
See Also
- Agent - Agent class documentation
- Streaming CLI - CLI streaming options
- Providers - LLM provider configuration

