# Streaming

The PraisonAI TypeScript SDK supports streaming responses, allowing you to receive agent output in real time as it is generated.
## Quickstart

```typescript
import { Agent } from 'praisonai';

const agent = new Agent({
  instructions: "You are a helpful assistant",
  stream: true // Enable streaming (default)
});

const response = await agent.chat("Tell me a story");
// The response streams to the console in real time
```
## Configuration

### Enable Streaming

Streaming is enabled by default:

```typescript
const agent = new Agent({
  instructions: "You are helpful",
  stream: true // Default value
});
```
### Disable Streaming

For batch processing, or when you need the complete response at once:

```typescript
const agent = new Agent({
  instructions: "You are helpful",
  stream: false
});

const response = await agent.chat("Hello");
// Returns the complete response after generation finishes
```
## Streaming Behavior

### With Verbose Mode

When both `stream` and `verbose` are enabled:

```typescript
const agent = new Agent({
  instructions: "You are helpful",
  stream: true,
  verbose: true
});

await agent.chat("Explain AI");
// Output streams to the console with formatting
```
### Silent Streaming

Stream without console output:

```typescript
const agent = new Agent({
  instructions: "You are helpful",
  stream: true,
  verbose: false
});

const response = await agent.chat("Hello");
// Streams internally, then returns the complete response
```
## Use Cases

### Interactive Chat

```typescript
const agent = new Agent({
  instructions: "You are a conversational assistant",
  stream: true,
  verbose: true
});

// The user sees the response as it is generated
await agent.chat("Tell me about space exploration");
```
### Long-Form Content

```typescript
const writer = new Agent({
  instructions: "You are a creative writer",
  stream: true
});

// Stream a long story
await writer.chat("Write a 1000-word story about a robot");
```
### Code Generation

```typescript
const coder = new Agent({
  instructions: "You are a code assistant",
  stream: true
});

// Stream code as it's written
await coder.chat("Write a React component for a todo list");
```
### Tool Calls

Tool calls are handled automatically during streaming:

```typescript
const getWeather = (city: string) => `Weather in ${city}: 22°C`;

const agent = new Agent({
  instructions: "You are a weather assistant",
  stream: true,
  tools: [getWeather]
});

await agent.chat("What's the weather in Paris?");
// The tool is called first, then the response streams
```
## When to Use Streaming

- **Interactive applications**: Real-time user feedback
- **Long responses**: Show progress during generation
- **Chat interfaces**: Natural conversation flow

## When to Disable Streaming

- **Batch processing**: Processing many requests at once
- **API endpoints**: A complete response is needed to build a JSON payload
- **Testing**: Easier to validate complete responses
```typescript
// Batch processing example
// Assume `documents` is a string[] of texts to summarize
const agent = new Agent({
  instructions: "Summarize text",
  stream: false // Simpler for batch workloads
});

const summaries = await Promise.all(
  documents.map(doc => agent.chat(`Summarize: ${doc}`))
);
```
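`Promise.all` fires every request at once, which can overwhelm an API when the document set is large. A minimal sketch of capping concurrency instead (`mapWithLimit` is a hypothetical helper, not part of the SDK; you would pass `agent.chat` as the worker function):

```typescript
// Hypothetical helper: run `work` over `items`, at most `limit` at a time.
async function mapWithLimit<T, R>(
  items: T[],
  limit: number,
  work: (item: T) => Promise<R>
): Promise<R[]> {
  const results: R[] = new Array(items.length);
  let next = 0;
  async function worker(): Promise<void> {
    while (next < items.length) {
      const i = next++; // Safe: single-threaded event loop, no await before increment
      results[i] = await work(items[i]);
    }
  }
  // Start `limit` workers; each pulls the next unclaimed item when it finishes one.
  await Promise.all(
    Array.from({ length: Math.min(limit, items.length) }, () => worker())
  );
  return results;
}
```

Usage would mirror the batch example above, e.g. `await mapWithLimit(documents, 4, doc => agent.chat(`Summarize: ${doc}`))`, keeping at most four chats in flight.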
## Environment Variables

```bash
# Control the default streaming behavior
PRAISON_STREAM=true

# Control verbose output
PRAISON_VERBOSE=true
```
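Environment variables arrive in Node as strings, so they need to be mapped onto booleans if you wire them up yourself. A sketch of doing that explicitly (the `envFlag` helper is hypothetical, not part of the SDK, which may already read these variables on its own):

```typescript
// Hypothetical helper: interpret an environment variable as a boolean flag.
function envFlag(name: string, fallback: boolean): boolean {
  const raw = process.env[name];
  if (raw === undefined) return fallback; // Unset: keep the default
  return raw.toLowerCase() === 'true';
}

// Usage sketch: pass the parsed flags explicitly when constructing an agent.
// const agent = new Agent({
//   instructions: "You are helpful",
//   stream: envFlag('PRAISON_STREAM', true),
//   verbose: envFlag('PRAISON_VERBOSE', true)
// });
```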
## Example: Streaming Chat Application

```typescript
import { Agent } from 'praisonai';
import * as readline from 'readline';

const agent = new Agent({
  instructions: "You are a helpful assistant",
  stream: true,
  verbose: true
});

const rl = readline.createInterface({
  input: process.stdin,
  output: process.stdout
});

async function chat() {
  rl.question('You: ', async (input) => {
    if (input.toLowerCase() === 'exit') {
      rl.close();
      return;
    }
    console.log('Assistant: ');
    await agent.chat(input);
    console.log('\n');
    chat(); // Prompt for the next message
  });
}

chat();
```
The TypeScript SDK streams responses via the `stream: true` configuration. For advanced event-based streaming with `StreamEvent`, `StreamEventType`, callbacks, and metrics (TTFT tracking), use the Python SDK's streaming module.

## See Also