MockLlmProvider

Defined in the LLM module of the PraisonAI Rust AI Agent SDK.

A mock LLM provider for testing that makes no real API calls.

Fields

| Name | Type | Description |
| --- | --- | --- |
| `model` | `String` | - |
| `responses` | `std::sync::Mutex<Vec<String>>` | - |
| `tool_calls` | `std::sync::Mutex<Vec<Vec<ToolCall>>>` | - |

Methods

new

fn new() -> Self
Create a new mock provider

add_response

fn add_response(&self, response: impl Into<String>)

Add a response to the queue of canned replies; responses are returned in insertion order (FIFO).

Parameters:

| Name | Type |
| --- | --- |
| `response` | `impl Into<String>` |
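The FIFO behavior can be illustrated with a minimal stand-in that mirrors the documented `responses` field. This is a sketch, not the SDK's implementation; `MockProvider` and the `next_response` drain method are hypothetical names introduced here for illustration (the real provider lives in `praisonai/src/llm/mod.rs`).

```rust
use std::sync::Mutex;

// Hypothetical stand-in mirroring the documented `responses` field.
struct MockProvider {
    responses: Mutex<Vec<String>>,
}

impl MockProvider {
    fn new() -> Self {
        MockProvider { responses: Mutex::new(Vec::new()) }
    }

    // Queue a canned response; &self suffices because the Vec sits behind a Mutex.
    fn add_response(&self, response: impl Into<String>) {
        self.responses.lock().unwrap().push(response.into());
    }

    // Hypothetical drain method: pop from the front so responses
    // come back in insertion order (FIFO).
    fn next_response(&self) -> Option<String> {
        let mut queue = self.responses.lock().unwrap();
        if queue.is_empty() { None } else { Some(queue.remove(0)) }
    }
}

fn main() {
    let mock = MockProvider::new();
    mock.add_response("first");
    mock.add_response("second");
    assert_eq!(mock.next_response().as_deref(), Some("first"));
    assert_eq!(mock.next_response().as_deref(), Some("second"));
    assert_eq!(mock.next_response(), None);
    println!("FIFO order verified");
}
```

The interior `Mutex` is why `add_response` takes `&self` rather than `&mut self`: the mock can be shared with an agent under test while the test itself keeps queuing responses.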

add_tool_calls

fn add_tool_calls(&self, calls: Vec<ToolCall>)

Add tool calls to be returned with the next response.

Parameters:

| Name | Type |
| --- | --- |
| `calls` | `Vec<ToolCall>` |

with_response

fn with_response(response: impl Into<String>) -> Self

Create a provider pre-loaded with a single response.

Parameters:

| Name | Type |
| --- | --- |
| `response` | `impl Into<String>` |
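A typical test pairs `with_response` with `add_tool_calls`. The sketch below assumes a placeholder `ToolCall` struct (the real type is defined elsewhere in the SDK) and a hypothetical `MockProvider` mirroring the documented fields and signatures:

```rust
use std::sync::Mutex;

// Placeholder for the SDK's ToolCall type, introduced only for this sketch.
struct ToolCall {
    name: String,
}

// Hypothetical stand-in mirroring the documented fields.
struct MockProvider {
    responses: Mutex<Vec<String>>,
    tool_calls: Mutex<Vec<Vec<ToolCall>>>,
}

impl MockProvider {
    fn new() -> Self {
        MockProvider {
            responses: Mutex::new(Vec::new()),
            tool_calls: Mutex::new(Vec::new()),
        }
    }

    // Convenience constructor: a fresh provider pre-loaded with one response.
    fn with_response(response: impl Into<String>) -> Self {
        let provider = Self::new();
        provider.responses.lock().unwrap().push(response.into());
        provider
    }

    // Queue a batch of tool calls to accompany the next response.
    fn add_tool_calls(&self, calls: Vec<ToolCall>) {
        self.tool_calls.lock().unwrap().push(calls);
    }
}

fn main() {
    let mock = MockProvider::with_response("done");
    mock.add_tool_calls(vec![ToolCall { name: "search".into() }]);
    assert_eq!(mock.responses.lock().unwrap().len(), 1);
    assert_eq!(mock.tool_calls.lock().unwrap()[0][0].name, "search");
    println!("ok");
}
```

Note that `with_response` is an associated constructor (no `&self`), so it reads naturally at the start of a test, while `add_tool_calls` mutates an existing provider through the interior `Mutex`.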

Source

praisonai/src/llm/mod.rs, line 393 (View on GitHub)

Related

- Rust LLM
- Rust Providers
- Rust Failover