Langfuse provides observability and evaluation tools for LLM applications, with automatic tracing of all agent conversations.

Documentation Index
Fetch the complete documentation index at: https://docs.praison.ai/llms.txt
Use this file to discover all available pages before exploring further.
Path Comparison
| | Path A — obs.langfuse() | Path B — praisonai --observe langfuse |
|---|---|---|
| Usage | Python script | CLI flag (also PRAISONAI_OBSERVE=langfuse) |
| Mechanism | Instruments OpenAI client globally via langfuse.openai drop-in | LangfuseSink + ContextTraceEmitter bridge |
| Span Coverage | Per-LLM-call generations (input/output, tokens, model) | Full lifecycle: agent_start, agent_end, tool_call_*, llm_* |
| Manual flush needed? | Yes — provider.flush() | No — atexit registers it |
| Best for | Programmatic agents, any Python flow | YAML / CLI workflows, multi-agent pipelines |
Quick Start
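A minimal Path A sketch. The import paths below are assumptions (adjust them to your install); the calls `obs.langfuse()`, `Agent`, and `provider.flush()` are the names used elsewhere on this page:

```python
# Sketch only: import paths are assumptions; obs.langfuse(), Agent,
# and provider.flush() are the names documented on this page.
try:
    from praisonai import obs            # assumed import path
    from praisonaiagents import Agent    # assumed import path
except ImportError:
    obs = Agent = None  # library not installed; treat as a sketch

if obs is not None:
    provider = obs.langfuse()   # instruments the OpenAI client globally (Path A)
    agent = Agent(instructions="You are a helpful assistant")
    agent.start("What is the capital of France?")
    provider.flush()            # Path A requires an explicit flush before exit
```

The guard around the imports only keeps the sketch importable without the library installed; in real code you would import directly.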
How Path A Works — obs.langfuse()
| Component | Purpose |
|---|---|
| obs.langfuse() | Instruments OpenAI client globally for automatic tracing |
| Agent | Makes LLM calls that are automatically traced |
| Langfuse SDK | Captures traces via langfuse.openai drop-in |
Environment Variables
| Variable | Required | Description |
|---|---|---|
| LANGFUSE_PUBLIC_KEY | ✅ | Your Langfuse public key (pk-lf-...) |
| LANGFUSE_SECRET_KEY | ✅ | Your Langfuse secret key (sk-lf-...) |
| LANGFUSE_BASE_URL | For self-hosted | Base URL, e.g. http://localhost:3000 |
| LANGFUSE_HOST | For compatibility | Same as LANGFUSE_BASE_URL |
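Before running either path, export the credentials. A shell example (the placeholder key values must be replaced with your own):

```shell
# Required for both Path A and Path B
export LANGFUSE_PUBLIC_KEY="pk-lf-..."
export LANGFUSE_SECRET_KEY="sk-lf-..."
# Optional: only needed for self-hosted servers (the default is Langfuse Cloud)
export LANGFUSE_BASE_URL="http://localhost:3000"
```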
As of PraisonAI’s wrapper-layer refactor, OTEL_SDK_DISABLED and EC_TELEMETRY are only set on first observability use, not at import. User-set values are preserved (setdefault). If LANGFUSE_PUBLIC_KEY is set or ~/.praisonai/langfuse.env exists, OTEL_SDK_DISABLED=false is set explicitly so Langfuse v4 can use OTel internally.

CLI Observability — --observe langfuse
Enable full agent lifecycle tracing with a single CLI flag:
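For example (the workflow file name `agents.yaml` is an assumption; the flag and the `PRAISONAI_OBSERVE` variable come from the comparison table above):

```shell
# Flag form
praisonai --observe langfuse agents.yaml

# Equivalent environment-variable form
PRAISONAI_OBSERVE=langfuse praisonai agents.yaml
```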
What Gets Traced
The CLI flag --observe langfuse captures:

- Agent lifecycle: agent_start/agent_end spans
- LLM interactions: llm_request/llm_response with readable content
- Tool usage: tool_call_start/tool_call_end with args and results
- Automatic flush: no manual provider.flush() required
As of PR #1461, atexit auto-closes the sink — no manual flush is required for CLI runs. See Custom Tracing for the underlying ContextTraceSinkProtocol.

Programmatic — LangfuseSink + Context Bridge
For full control over Langfuse tracing in Python code:
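A sketch of the bridge, under stated assumptions: the names `LangfuseSink`, `LangfuseSinkConfig`, `set_context_emitter`, and `context_sink()` appear on this page, but their module locations are guesses, and `set_context_emitter` takes additional arguments that this page elides:

```python
# Sketch only: module paths are assumptions; adjust to your install.
try:
    from praisonai.observability import LangfuseSink, LangfuseSinkConfig
    from praisonai.observability import set_context_emitter
    HAVE_PRAISONAI = True
except ImportError:
    HAVE_PRAISONAI = False  # library not installed; sketch only

if HAVE_PRAISONAI:
    config = LangfuseSinkConfig(
        public_key="pk-lf-...",  # falls back to LANGFUSE_PUBLIC_KEY if ""
        secret_key="sk-lf-...",  # falls back to LANGFUSE_SECRET_KEY if ""
        flush_at=20,             # flush after 20 buffered events
        flush_interval=10.0,     # or every 10 seconds in the background
    )
    sink = LangfuseSink(config)
    # Bridge agent lifecycle events into the sink; the other
    # set_context_emitter arguments are elided in these docs.
    set_context_emitter(sink=sink.context_sink())
```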
The set_context_emitter(... sink=sink.context_sink() ...) call is required for typical single-agent flows. Without it, only RouterAgent token-usage and PlanningAgent.plan_created events appear in Langfuse — the Agent.start() lifecycle is silent.

LangfuseSinkConfig Options
| Option | Type | Default | Description |
|---|---|---|---|
| public_key | str | "" (then LANGFUSE_PUBLIC_KEY) | Langfuse public key (pk-lf-...) |
| secret_key | str | "" (then LANGFUSE_SECRET_KEY) | Langfuse secret key (sk-lf-...) |
| host | str | "" (then LANGFUSE_HOST → LANGFUSE_BASE_URL → https://cloud.langfuse.com) | Langfuse server URL |
| flush_at | int | 20 | Number of events that triggers a flush |
| flush_interval | float | 10.0 | Seconds between background flushes |
| enabled | bool | True | Master switch |
CLI Server Commands
- Local Server
- Configuration
- View Traces
Common Patterns
Multi-Agent Tracing
All agents in a session share the same Langfuse context automatically.

Connection Verification
Configuration File Usage
Credentials from ~/.praisonai/langfuse.env are auto-loaded:
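A plausible ~/.praisonai/langfuse.env — the dotenv key=value format is an assumption; the variable names come from the Environment Variables table above:

```shell
LANGFUSE_PUBLIC_KEY=pk-lf-...
LANGFUSE_SECRET_KEY=sk-lf-...
LANGFUSE_BASE_URL=http://localhost:3000
```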
Best Practices
Always Flush Before Exit (Path A)
Call provider.flush() to ensure all traces are sent for obs.langfuse(). Path B (--observe langfuse) auto-registers atexit.close since PR #1461.
Use Auto-Detection
Prefer obs.auto() for environment-based configuration:
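For instance (the import path is an assumption, and so is the idea that obs.auto() returns a flushable provider — this page only documents provider.flush() for obs.langfuse()):

```python
# Sketch only: import path and return value are assumptions.
try:
    from praisonai import obs  # assumed import path
except ImportError:
    obs = None  # library not installed; sketch only

if obs is not None:
    provider = obs.auto()  # pick a provider from LANGFUSE_* environment variables
    # ... run your agents ...
    provider.flush()       # flush before exit, as with obs.langfuse()
```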
Trace Content Quality (PR #1461)
As of PR #1461, llm_response spans contain the assistant message text (or a [tool_calls: name1, name2] summary), not the raw ChatCompletion(...) repr. The Langfuse “Output” panel is now human-readable:

- Before: ChatCompletion(id='chatcmpl-...', choices=[Choice(...)], ...)
- After: "The capital of France is Paris." or [tool_calls: search_web, calculator]
Context Bridge for Full Coverage
For programmatic usage, include the context emitter for complete lifecycle tracing. Without this, only RouterAgent and PlanningAgent events appear — Agent.start() flows are silent.
Local Development Setup
Use the CLI for local development:

- praisonai langfuse start
- praisonai langfuse config
- Test with praisonai langfuse test
Related

- Observability Overview — compare observability providers
- Agent Configuration — configure agent settings

