# LlmConfig

LLM configuration for the PraisonAI Rust AI Agent SDK. Defined in the `llm` module.
## Fields

| Name | Type | Description |
|---|---|---|
| `model` | `String` | Model name (e.g., `"gpt-4o-mini"`, `"claude-3-sonnet"`) |
| `api_key` | `Option<String>` | API key (optional; can be supplied via an environment variable) |
| `base_url` | `Option<String>` | Base URL (optional, for custom endpoints) |
| `temperature` | `f32` | Temperature (0.0 – 2.0) |
| `max_tokens` | `Option<u32>` | Maximum number of tokens (optional) |
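The `api_key` field can fall back to an environment variable. The helper below is a hypothetical sketch of that fallback, not the SDK's actual lookup; the variable name `OPENAI_API_KEY` is an assumption for illustration.

```rust
use std::env;

// Hypothetical helper illustrating the "can use env var" note on `api_key`:
// an explicitly configured key wins, otherwise fall back to the environment.
// The variable name OPENAI_API_KEY is assumed; the SDK may use another.
fn resolve_api_key(explicit: Option<String>) -> Option<String> {
    explicit.or_else(|| env::var("OPENAI_API_KEY").ok())
}

fn main() {
    env::set_var("OPENAI_API_KEY", "sk-from-env");
    // An explicit key takes precedence over the environment.
    assert_eq!(
        resolve_api_key(Some("sk-explicit".into())),
        Some("sk-explicit".to_string())
    );
    // With no explicit key, the environment variable is used.
    assert_eq!(resolve_api_key(None), Some("sk-from-env".to_string()));
}
```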
## Methods

### new

`fn new(model: impl Into<String>) -> Self`

Create a new LLM config with the given model.

Parameters:

| Name | Type |
|---|---|
| `model` | `impl Into<String>` |
### api_key

`fn api_key(mut self, key: impl Into<String>) -> Self`

Set the API key.

Parameters:

| Name | Type |
|---|---|
| `key` | `impl Into<String>` |
### base_url

`fn base_url(mut self, url: impl Into<String>) -> Self`

Set the base URL.

Parameters:

| Name | Type |
|---|---|
| `url` | `impl Into<String>` |
### temperature

`fn temperature(mut self, temp: f32) -> Self`

Set the temperature.

Parameters:

| Name | Type |
|---|---|
| `temp` | `f32` |
### max_tokens

`fn max_tokens(mut self, max: u32) -> Self`

Set the maximum number of tokens.

Parameters:

| Name | Type |
|---|---|
| `max` | `u32` |
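The methods above form a consuming builder: each takes `self` by value and returns it, so calls can be chained. The following self-contained sketch reimplements the API for illustration only; the struct body and the starting values in `new` (e.g., a 0.7 default temperature) are assumptions, and only the method names and signatures come from this page.

```rust
// Sketch of the builder pattern implied by the signatures documented above.
// Field defaults in `new` are assumed for illustration.
#[derive(Debug, Clone)]
pub struct LlmConfig {
    pub model: String,
    pub api_key: Option<String>,
    pub base_url: Option<String>,
    pub temperature: f32,
    pub max_tokens: Option<u32>,
}

impl LlmConfig {
    /// Create a new LLM config with the given model.
    pub fn new(model: impl Into<String>) -> Self {
        Self {
            model: model.into(),
            api_key: None,
            base_url: None,
            temperature: 0.7, // assumed default; not specified on this page
            max_tokens: None,
        }
    }

    /// Set the API key.
    pub fn api_key(mut self, key: impl Into<String>) -> Self {
        self.api_key = Some(key.into());
        self
    }

    /// Set the base URL.
    pub fn base_url(mut self, url: impl Into<String>) -> Self {
        self.base_url = Some(url.into());
        self
    }

    /// Set the temperature.
    pub fn temperature(mut self, temp: f32) -> Self {
        self.temperature = temp;
        self
    }

    /// Set the maximum number of tokens.
    pub fn max_tokens(mut self, max: u32) -> Self {
        self.max_tokens = Some(max);
        self
    }
}

fn main() {
    // Each builder call consumes `self` and returns it, allowing chaining.
    let config = LlmConfig::new("gpt-4o-mini")
        .temperature(0.2)
        .max_tokens(512);
    assert_eq!(config.model, "gpt-4o-mini");
    assert_eq!(config.max_tokens, Some(512));
    println!("{:?}", config);
}
```

Because each setter moves `self`, a partially built config cannot be reused after a chained call; clone it first if two variants are needed.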
## Source

View on GitHub: `praisonai/src/llm/mod.rs` at line 165