LlmConfig
Defined in the LLM Providers module of the Rust AI Agent SDK. LLM configuration.
Fields
| Name | Type | Description |
|---|---|---|
| model | String | Model name (e.g., "gpt-4o-mini", "claude-3-sonnet") |
| api_key | Option&lt;String&gt; | API key (optional, can use env var) |
| base_url | Option&lt;String&gt; | Base URL (optional, for custom endpoints) |
| temperature | f32 | Temperature (0.0 - 2.0) |
| max_tokens | Option&lt;u32&gt; | Max tokens |
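In serialized form, a config built from these fields might look like the following. This is an illustrative sketch, not SDK output: the field names come from the table above, and the omission of unset optional fields is assumed from the struct's `skip_serializing_if = "Option::is_none"` attributes.

```json
{
  "model": "gpt-4o-mini",
  "temperature": 0.7,
  "max_tokens": 1024
}
```

Here `api_key` and `base_url` are left unset, so they do not appear in the output.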
Methods
new
| Name | Type |
|---|---|
| model | impl Into&lt;String&gt; |
api_key
| Name | Type |
|---|---|
| key | impl Into&lt;String&gt; |
base_url
| Name | Type |
|---|---|
| url | impl Into&lt;String&gt; |
temperature
| Name | Type |
|---|---|
| temp | f32 |
max_tokens
| Name | Type |
|---|---|
| max | u32 |