
LlmProtocol

Defined in the protocols module. Protocol for LLM provider implementations in the Rust AI Agent SDK.
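
For orientation, here is a rough sketch of the trait shape implied by the method listing below. The LlmMessage, LlmResponse, and ToolSchema definitions in the sketch are simplified placeholders rather than the crate's actual types, and the use of the async-trait crate and an anyhow-style Result alias are assumptions, not confirmed by this page.

// Rough sketch of the trait shape implied by this page. The structs below
// are simplified placeholders standing in for the crate's real types.
use anyhow::Result; // assumption: the crate most likely defines its own Result alias

pub struct LlmMessage {
    pub role: String,    // assumed shape, e.g. "system" / "user" / "assistant"
    pub content: String,
}

pub struct LlmResponse {
    pub content: String, // assumed shape
}

pub struct ToolSchema {
    pub name: String,        // assumed shape
    pub description: String,
}

// async fn in a trait needs Rust 1.75+ or the async-trait crate; async-trait
// is assumed here so the trait can still be used as a trait object.
#[async_trait::async_trait]
pub trait LlmProtocol {
    /// Get the model name.
    fn model(&self) -> &str;

    /// Chat with the LLM.
    async fn chat(&self, messages: &[LlmMessage]) -> Result<LlmResponse>;

    /// Chat with the LLM, passing the given tool schemas.
    async fn chat_with_tools(
        &self,
        messages: &[LlmMessage],
        tools: &[ToolSchema],
    ) -> Result<LlmResponse>;
}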

Methods

model

fn model(&self) -> &str
Get the model name.

chat

async fn chat(&self, messages: &[LlmMessage]) -> Result<LlmResponse>
Chat with the LLM.

Parameters:

Name        Type
messages    &[LlmMessage]
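
As a usage sketch, a caller builds a slice of messages and awaits the response. The helper below relies on the placeholder LlmMessage and LlmResponse definitions from the sketch above; the function name and role strings are hypothetical.

// Hypothetical helper showing how chat() might be called; relies on the
// placeholder types defined in the trait sketch above.
async fn ask(llm: &impl LlmProtocol, prompt: &str) -> Result<String> {
    let messages = vec![
        LlmMessage {
            role: "system".to_string(),
            content: "You are a helpful assistant.".to_string(),
        },
        LlmMessage {
            role: "user".to_string(),
            content: prompt.to_string(),
        },
    ];
    // chat() takes the messages as a slice and returns the provider's reply.
    let response = llm.chat(&messages).await?;
    Ok(response.content)
}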

chat_with_tools

async fn chat_with_tools(
        &self,
        messages: &[LlmMessage],
        tools: &[ToolSchema],
    ) -> Result<LlmResponse>
Chat with the LLM, passing the given tool schemas.

Parameters:

Name        Type
messages    &[LlmMessage]
tools       &[ToolSchema]
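
On the provider side, implementing the trait means mapping these calls onto a concrete backend. The mock below, which simply echoes the last message, shows the shape of such an implementation; it builds on the placeholder types from the sketch above and is illustrative only, not code from the crate.

// Hypothetical mock provider; handy as a stand-in for a real backend in tests.
pub struct MockLlm {
    pub model_name: String,
}

#[async_trait::async_trait]
impl LlmProtocol for MockLlm {
    fn model(&self) -> &str {
        &self.model_name
    }

    async fn chat(&self, messages: &[LlmMessage]) -> Result<LlmResponse> {
        // Echo the last message instead of calling a real LLM backend.
        let last = messages.last().map(|m| m.content.as_str()).unwrap_or("");
        Ok(LlmResponse { content: format!("echo: {last}") })
    }

    async fn chat_with_tools(
        &self,
        messages: &[LlmMessage],
        tools: &[ToolSchema],
    ) -> Result<LlmResponse> {
        // A real provider would forward the tool schemas to its API; here we
        // just record which tools were offered alongside the echoed message.
        let names: Vec<&str> = tools.iter().map(|t| t.name.as_str()).collect();
        let last = messages.last().map(|m| m.content.as_str()).unwrap_or("");
        Ok(LlmResponse {
            content: format!("echo: {last} (tools offered: {})", names.join(", ")),
        })
    }
}

A caller can then construct the mock, for example MockLlm { model_name: "mock-model".to_string() }, and drive it through the ask helper shown earlier.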

Source

View on GitHub

praisonai/src/protocols/mod.rs