Agents can stream responses, letting you display results word by word as they are generated.

Quick Start

1. Stream Response

import { Agent } from 'praisonai';

const agent = new Agent({
  instructions: 'You tell engaging stories'
});

// Stream word by word
for await (const chunk of agent.stream('Tell me a story')) {
  process.stdout.write(chunk);
}
2. With Callback

await agent.chat('Explain quantum physics', {
  stream: true,
  onChunk: (chunk) => {
    displayInUI(chunk);
  }
});
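The consumption pattern behind both examples can be sketched without the praisonai package or an API key. Here `fakeStream` is a hypothetical stand-in for `agent.stream()`, used only to show how chunks arrive and can be both rendered and accumulated:

```typescript
// Sketch of the stream-consumption pattern. `fakeStream` is a
// mock async generator standing in for agent.stream(), so this
// runs without the praisonai package or a model call.
async function* fakeStream(text: string): AsyncGenerator<string> {
  for (const word of text.split(' ')) {
    yield word + ' ';
  }
}

let full = '';
for await (const chunk of fakeStream('Once upon a time')) {
  process.stdout.write(chunk); // render incrementally
  full += chunk;               // keep the complete text as well
}
console.log('\nComplete:', full.trim());
```

Accumulating alongside rendering is useful when you need the final text afterwards (for logging or post-processing) without waiting for a separate non-streamed call.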

User Interaction Flow


Configuration Levels

// Level 1: Boolean flag - enable streaming on chat()
const response = await agent.chat('Hello', {
  stream: true
});

// Level 2: Method - Use stream iterator
for await (const chunk of agent.stream('Hello')) {
  console.log(chunk);
}

// Level 3: Options object - with callbacks
await agent.chat('Hello', {
  stream: true,
  onChunk: (chunk) => console.log(chunk),
  onComplete: (full) => console.log('Done:', full)
});

When to Stream

| Scenario | Use Streaming? |
| --- | --- |
| Long responses | ✅ Yes - better UX |
| Short answers | ❌ No - wait for the complete response |
| Real-time chat | ✅ Yes - feels natural |
| Data extraction | ❌ No - needs the complete output |
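The data-extraction row deserves a concrete illustration: individual chunks are rarely valid JSON on their own, so a parser must buffer until the stream ends. The stream below is a mock standing in for `agent.stream()`:

```typescript
// Why data extraction should wait for the full response:
// each chunk alone is not valid JSON. `fakeStream` is a
// hypothetical stand-in for agent.stream().
async function* fakeStream(): AsyncGenerator<string> {
  yield '{"name": ';
  yield '"Ada", ';
  yield '"score": 42}';
}

let buffer = '';
for await (const chunk of fakeStream()) {
  buffer += chunk; // accumulate; do not parse yet
}
const data = JSON.parse(buffer); // parse only once the stream ends
console.log(data.name, data.score);
```

Parsing per-chunk would throw on the first fragment; buffering first makes the streamed and non-streamed paths equivalent for structured output.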

API Reference

RealtimeConfig

Complete configuration options

RealtimeAgent

Full class documentation

Best Practices

- Streaming helps most with responses that take more than a few seconds to generate.
- Users may cancel mid-stream; handle it gracefully.
- When parsing JSON, wait for the complete output.