
LlmProvider

Trait for LLM providers in the Rust AI Agent SDK. Defined in the LLM module.

Methods

chat

async fn chat(
    &self,
    messages: &[Message],
    tools: Option<&[ToolDefinition]>,
) -> Result<LlmResponse>
Send a chat completion request.

Parameters:

Name      Type
messages  &[Message]
tools     Option<&[ToolDefinition]>
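
For example, a caller might request a plain completion with no tools attached. A minimal sketch, assuming the SDK's Message, LlmResponse, and Result types are in scope; the helper name ask is hypothetical.

async fn ask<P: LlmProvider>(
    provider: &P,
    messages: &[Message],
) -> Result<LlmResponse> {
    // Pass None for tools to request a plain completion.
    provider.chat(messages, None).await
}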

chat_stream

async fn chat_stream(
    &self,
    messages: &[Message],
    tools: Option<&[ToolDefinition]>,
) -> Result<Box<dyn futures::Stream<Item = Result<String>> + Send + Unpin>>
Stream a chat completion (returns chunks).

Parameters:

Name      Type
messages  &[Message]
tools     Option<&[ToolDefinition]>
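
Because the boxed stream is Unpin, it can be drained with futures::StreamExt::next. A minimal sketch, assuming the SDK's Message and Result types are in scope; the helper name print_stream is hypothetical.

use futures::StreamExt;

async fn print_stream<P: LlmProvider>(
    provider: &P,
    messages: &[Message],
) -> Result<()> {
    let mut stream = provider.chat_stream(messages, None).await?;
    while let Some(chunk) = stream.next().await {
        // Each item is a Result<String>: fail fast on an error,
        // otherwise print the text chunk as it arrives.
        print!("{}", chunk?);
    }
    Ok(())
}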

model

fn model(&self) -> &str
Get the model name.
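
Taken together, the three methods make it easy to wrap one provider in another by delegating each call to an inner provider. A minimal sketch, assuming the trait uses plain async-fn-in-trait as the signatures above suggest (the real definition may rely on async_trait instead); the LoggingProvider type is hypothetical.

struct LoggingProvider<P> {
    inner: P,
}

impl<P: LlmProvider> LlmProvider for LoggingProvider<P> {
    async fn chat(
        &self,
        messages: &[Message],
        tools: Option<&[ToolDefinition]>,
    ) -> Result<LlmResponse> {
        // Note which model handles the request, then delegate.
        eprintln!("chat via {}", self.inner.model());
        self.inner.chat(messages, tools).await
    }

    async fn chat_stream(
        &self,
        messages: &[Message],
        tools: Option<&[ToolDefinition]>,
    ) -> Result<Box<dyn futures::Stream<Item = Result<String>> + Send + Unpin>> {
        eprintln!("chat_stream via {}", self.inner.model());
        self.inner.chat_stream(messages, tools).await
    }

    fn model(&self) -> &str {
        self.inner.model()
    }
}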

Source

praisonai/src/llm/mod.rs (view on GitHub)

Related pages

Rust LLM
Rust Providers
Rust Failover