
Documentation Index

Fetch the complete documentation index at: https://docs.praison.ai/llms.txt

Use this file to discover all available pages before exploring further.

LlmProtocol

Defined in the protocols module of the PraisonAI Rust AI Agent SDK. This protocol (trait) defines the interface that LLM provider implementations must satisfy.

Methods

model

fn model(&self) -> &str
Returns the model name.

chat

async fn chat(&self, messages: &[LlmMessage]) -> Result<LlmResponse>
Chat with the LLM.

Parameters:
- messages: &[LlmMessage]

chat_with_tools

async fn chat_with_tools(
        &self,
        messages: &[LlmMessage],
        tools: &[ToolSchema],
    ) -> Result<LlmResponse>
Chat with the LLM, with tool definitions available.

Parameters:
- messages: &[LlmMessage]
- tools: &[ToolSchema]
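The trait above can be implemented for any provider backend. Below is a minimal sketch: the `LlmMessage`, `LlmResponse`, `ToolSchema`, and `Result` definitions are placeholder assumptions (the real types live in the praisonai crate), and `MockProvider` is a hypothetical provider that just echoes the last message. A tiny `block_on` helper is included so the async methods can run without an external executor; async fn in traits requires Rust 1.75+.

```rust
use std::future::Future;
use std::pin::pin;
use std::task::{Context, Poll, RawWaker, RawWakerVTable, Waker};

// Stand-in error type; the SDK's own Result alias is assumed to be similar.
type Result<T> = std::result::Result<T, String>;

// Hypothetical data shapes, for illustration only.
#[derive(Clone)]
struct LlmMessage {
    role: String,
    content: String,
}
struct LlmResponse {
    content: String,
}
struct ToolSchema {
    name: String,
}

// The trait shape documented above.
trait LlmProtocol {
    fn model(&self) -> &str;
    async fn chat(&self, messages: &[LlmMessage]) -> Result<LlmResponse>;
    async fn chat_with_tools(
        &self,
        messages: &[LlmMessage],
        tools: &[ToolSchema],
    ) -> Result<LlmResponse>;
}

// A mock provider that echoes the last message it was given.
struct MockProvider;

impl LlmProtocol for MockProvider {
    fn model(&self) -> &str {
        "mock-model"
    }

    async fn chat(&self, messages: &[LlmMessage]) -> Result<LlmResponse> {
        let last = messages.last().ok_or("empty message list")?;
        Ok(LlmResponse {
            content: format!("echo: {}", last.content),
        })
    }

    async fn chat_with_tools(
        &self,
        messages: &[LlmMessage],
        tools: &[ToolSchema],
    ) -> Result<LlmResponse> {
        // A real provider would advertise `tools` to the model; the mock
        // only records how many were offered.
        let mut resp = self.chat(messages).await?;
        resp.content.push_str(&format!(" ({} tool(s) offered)", tools.len()));
        Ok(resp)
    }
}

// Tiny executor: enough for futures that never actually suspend.
fn block_on<F: Future>(fut: F) -> F::Output {
    unsafe fn clone_raw(_: *const ()) -> RawWaker {
        RawWaker::new(std::ptr::null(), &VTABLE)
    }
    unsafe fn noop(_: *const ()) {}
    static VTABLE: RawWakerVTable = RawWakerVTable::new(clone_raw, noop, noop, noop);
    let waker = unsafe { Waker::from_raw(RawWaker::new(std::ptr::null(), &VTABLE)) };
    let mut cx = Context::from_waker(&waker);
    let mut fut = pin!(fut);
    loop {
        if let Poll::Ready(v) = fut.as_mut().poll(&mut cx) {
            return v;
        }
    }
}

fn main() {
    let provider = MockProvider;
    assert_eq!(provider.model(), "mock-model");

    let msgs = vec![LlmMessage { role: "user".into(), content: "hi".into() }];
    let resp = block_on(provider.chat(&msgs)).unwrap();
    println!("{}", resp.content); // echo: hi

    let tools = vec![ToolSchema { name: "search".into() }];
    let resp = block_on(provider.chat_with_tools(&msgs, &tools)).unwrap();
    println!("{}", resp.content);
}
```

Keeping `chat_with_tools` as a thin wrapper over `chat` is just a convenience of the mock; a real implementation would serialize the tool schemas into the provider's request format.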

Source

View on GitHub

praisonai/src/protocols/mod.rs
