Lite Package CLI

Commands for using the lightweight praisonaiagents.lite package from the command line.

Commands

Show Package Info

Display information about the lite package:
praisonai lite info
Output:
praisonaiagents.lite - Lightweight Agent Package
==================================================

Classes: LiteAgent, LiteTask, LiteToolResult
Decorators: @tool
Helpers: create_openai_llm_fn, create_anthropic_llm_fn

Features:
  • BYO-LLM (Bring Your Own LLM)
  • Thread-safe chat history
  • Tool execution
  • No litellm dependency
  • Minimal memory footprint
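
The info output above lists a @tool decorator and tool execution among the lite package's features, but this page never shows them in use. A minimal sketch, assuming @tool can wrap an ordinary Python function and that LiteAgent accepts a tools list (the tools parameter is an assumption, not documented here):

from praisonaiagents.lite import LiteAgent, tool, create_openai_llm_fn

# Hypothetical tool: @tool is listed in the info output above; its exact
# registration behaviour is assumed, not shown on this page.
@tool
def add(a: int, b: int) -> int:
    """Add two integers."""
    return a + b

# Assumption: LiteAgent takes a tools= list; only name, llm_fn and
# instructions are confirmed by the example in the next section.
agent = LiteAgent(
    name="Calculator",
    llm_fn=create_openai_llm_fn(model="gpt-4o-mini"),
    instructions="Use the add tool for arithmetic.",
    tools=[add],
)
print(agent.chat("What is 127 + 588?"))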

Show Example Code

Display example usage code:
praisonai lite example
Output:
# Example: Using praisonaiagents.lite with custom LLM

from praisonaiagents.lite import LiteAgent, tool

# Define a custom LLM function
def my_llm(messages):
    # Your custom LLM implementation
    pass

# Or use the built-in OpenAI adapter
from praisonaiagents.lite import create_openai_llm_fn
llm_fn = create_openai_llm_fn(model="gpt-4o-mini")

# Create a lite agent
agent = LiteAgent(
    name="MyAgent",
    llm_fn=llm_fn,
    instructions="You are a helpful assistant."
)

# Chat with the agent
response = agent.chat("Hello!")
print(response)
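
In the example output above, my_llm is left as a stub. One minimal way to flesh it out with the official openai SDK, assuming the lite package passes OpenAI-style message dicts (a list of {"role": ..., "content": ...}) and expects the reply text back as a plain string; that contract is an assumption, since it is not spelled out on this page:

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def my_llm(messages):
    # Assumption: messages arrive as OpenAI-style role/content dicts
    # and the agent wants the assistant's reply as a plain string.
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=messages,
    )
    return response.choices[0].message.content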

Run Lite Agent

Run a lite agent with a prompt:
# Using OpenAI (default)
praisonai lite run "Hello, how are you?"

# Specify model
praisonai lite run "Hello" --model gpt-4o-mini

# Use Anthropic
praisonai lite run "Hello" --provider anthropic --model claude-3-5-sonnet-20241022
Options:
| Option     | Description                      | Default     |
|------------|----------------------------------|-------------|
| --model    | Model name to use                | gpt-4o-mini |
| --provider | LLM provider (openai, anthropic) | openai      |
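
In Python, the --provider anthropic path presumably maps to the create_anthropic_llm_fn helper listed by praisonai lite info. A sketch under the assumption that it takes a model argument like its OpenAI counterpart (its signature is not documented on this page):

from praisonaiagents.lite import LiteAgent, create_anthropic_llm_fn

# Assumption: create_anthropic_llm_fn mirrors create_openai_llm_fn(model=...)
llm_fn = create_anthropic_llm_fn(model="claude-3-5-sonnet-20241022")

agent = LiteAgent(
    name="MyAgent",
    llm_fn=llm_fn,
    instructions="You are a helpful assistant.",
)
print(agent.chat("Hello, how are you?"))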

Environment Variables

Set the API key environment variable required by your provider:
# For OpenAI
export OPENAI_API_KEY="your-key"

# For Anthropic
export ANTHROPIC_API_KEY="your-key"
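
The same variables apply when you build the adapters from Python. A small guard you can run before constructing an adapter (the adapters' exact behaviour when a key is missing is not documented here, so failing early is simply a defensive choice):

import os

provider = "openai"  # or "anthropic"
key_var = "OPENAI_API_KEY" if provider == "openai" else "ANTHROPIC_API_KEY"
if not os.environ.get(key_var):
    raise RuntimeError(f"Set {key_var} before creating the {provider} adapter")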

Examples

Quick Chat

# Simple chat with OpenAI
export OPENAI_API_KEY="your-key"
praisonai lite run "What is 2+2?"

Using Different Models

# GPT-4o
praisonai lite run "Explain quantum computing" --model gpt-4o

# Claude
praisonai lite run "Write a haiku" --provider anthropic --model claude-3-5-sonnet-20241022

Check Availability

# Verify lite package is available
praisonai lite info
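
From Python, the equivalent availability check is simply attempting the import used throughout this page:

try:
    from praisonaiagents.lite import LiteAgent  # noqa: F401
    print("praisonaiagents.lite is available")
except ImportError:
    print("praisonaiagents.lite is not installed")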

Comparison with Full CLI

| Feature        | praisonai         | praisonai lite |
|----------------|-------------------|----------------|
| Multi-provider | Yes (via litellm) | Manual only    |
| Memory usage   | ~93MB             | ~5MB           |
| Startup time   | ~800ms            | ~18ms          |
| Dependencies   | Many              | Minimal        |