
LlmResponse

Defined in the `protocols` module. An LLM response returned by a provider.

Fields

| Name | Type | Description |
| --- | --- | --- |
| `content` | `String` | Response content |
| `tool_calls` | `Vec<ToolCall>` | Tool calls, if any |
| `usage` | `Option<TokenUsage>` | Token usage |

Source

View on GitHub

praisonai/src/protocols/mod.rs at line 199
