
LlmConfig

Defined in the LLM Providers module.
LLM configuration for the Rust AI Agent SDK.

Fields

| Name | Type | Description |
| --- | --- | --- |
| `model` | `String` | Model name (e.g., `"gpt-4o-mini"`, `"claude-3-sonnet"`) |
| `api_key` | `Option<String>` | API key (optional, can use env var) |
| `base_url` | `Option<String>` | Base URL (optional, for custom endpoints) |
| `temperature` | `f32` | Temperature (0.0 - 2.0) |
| `max_tokens` | `Option<u32>` | Max tokens |

Methods

new

fn new(model: impl Into<String>) -> Self
Create a new LLM config with the given model.

Parameters:

| Name | Type |
| --- | --- |
| `model` | `impl Into<String>` |

api_key

fn api_key(mut self, key: impl Into<String>) -> Self
Set the API key.

Parameters:

| Name | Type |
| --- | --- |
| `key` | `impl Into<String>` |

base_url

fn base_url(mut self, url: impl Into<String>) -> Self
Set the base URL.

Parameters:

| Name | Type |
| --- | --- |
| `url` | `impl Into<String>` |

temperature

fn temperature(mut self, temp: f32) -> Self
Set the temperature.

Parameters:

| Name | Type |
| --- | --- |
| `temp` | `f32` |

max_tokens

fn max_tokens(mut self, max: u32) -> Self
Set max tokens.

Parameters:

| Name | Type |
| --- | --- |
| `max` | `u32` |