# Gemini CLI Integration
PraisonAI provides integration with Google’s Gemini CLI for AI-powered code analysis, generation, and refactoring tasks.

## Installation
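The install commands are not shown on this page; assuming the standard distribution channels (the `@google/gemini-cli` npm package for the CLI and the `praisonai` PyPI package), installation would look like:

```shell
# Install the Gemini CLI (requires Node.js); package name assumed
npm install -g @google/gemini-cli

# Install PraisonAI
pip install praisonai
```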
## Quick Start
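The original quick-start snippet is missing here. As a sketch of the call the integration makes under the hood (the PraisonAI wrapper class is not shown on this page, so this drives the `gemini` binary directly and assumes it is on `PATH`):

```python
import shutil
import subprocess

def ask_gemini(prompt: str, model: str = "gemini-2.5-pro", timeout: int = 300) -> list:
    """Assemble (and, when the CLI is installed, run) a headless Gemini call."""
    # -p = print mode (headless), -m = model selection, JSON output for parsing
    cmd = ["gemini", "-p", prompt, "-m", model, "--output-format", "json"]
    if shutil.which("gemini"):
        result = subprocess.run(cmd, capture_output=True, text=True, timeout=timeout)
        print(result.stdout)
    return cmd

ask_gemini("Explain what this repository does")
```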
## Configuration Options
| Option | Type | Default | Description |
|---|---|---|---|
| `workspace` | `str` | `"."` | Working directory for CLI execution |
| `timeout` | `int` | `300` | Timeout in seconds |
| `output_format` | `str` | `"json"` | Output format: `"json"`, `"text"`, `"stream-json"` |
| `model` | `str` | `"gemini-2.5-pro"` | Gemini model to use |
| `include_directories` | `list` | `None` | Additional directories to include in context |
| `sandbox` | `bool` | `False` | Run in sandbox mode |
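Collected as a plain mapping, the defaults from the table are:

```python
# Defaults from the table above; None for include_directories means
# only the workspace itself is passed to the CLI.
DEFAULT_CONFIG = {
    "workspace": ".",
    "timeout": 300,                # seconds
    "output_format": "json",       # "json", "text", or "stream-json"
    "model": "gemini-2.5-pro",
    "include_directories": None,
    "sandbox": False,
}
print(DEFAULT_CONFIG["model"])
```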
## Examples
### Basic Execution
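The original example is missing; as a sketch (not the PraisonAI API itself), a basic execution honoring the `workspace` and `timeout` options could look like:

```python
import shutil
import subprocess

def execute(prompt: str, workspace: str = ".", timeout: int = 300) -> str:
    """Run one headless prompt from the given working directory."""
    cmd = ["gemini", "-p", prompt, "--output-format", "json"]
    if not shutil.which("gemini"):
        return " ".join(cmd)  # CLI not installed; show what would have run
    result = subprocess.run(
        cmd, cwd=workspace, capture_output=True, text=True, timeout=timeout
    )
    return result.stdout

print(execute("List the main modules in this project"))
```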
### Model Selection
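Following the best-practice guidance below (`gemini-2.5-flash` for quick tasks, `gemini-2.5-pro` for complex analysis), model choice reduces to the `-m` flag; a small sketch:

```python
def model_flags(task_complexity: str) -> list:
    """Pick a model per the best-practice guidance and emit the -m flag."""
    model = "gemini-2.5-pro" if task_complexity == "complex" else "gemini-2.5-flash"
    return ["-m", model]

print(model_flags("quick"))    # ['-m', 'gemini-2.5-flash']
print(model_flags("complex"))  # ['-m', 'gemini-2.5-pro']
```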
### Multi-Directory Context
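A sketch of turning the `include_directories` option into `--include-directories` flags. Whether the real flag repeats per directory or takes a comma-separated list isn't specified on this page; this sketch assumes it repeats:

```python
def context_flags(include_directories=None) -> list:
    """Translate include_directories into --include-directories flags."""
    flags = []
    for directory in include_directories or []:
        flags += ["--include-directories", directory]
    return flags

print(context_flags(["../shared", "../docs"]))
```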
### With Usage Stats
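`execute_with_stats()` is named in the best practices below; a sketch of what such a helper might return, assuming the JSON output carries a usage section (the `"response"` and `"stats"` field names here are assumptions for illustration):

```python
import json

def execute_with_stats(raw_json: str):
    """Split a JSON CLI response into (text, usage stats).

    Field names are assumed; adapt them to the real JSON output schema.
    """
    data = json.loads(raw_json)
    return data.get("response", ""), data.get("stats", {})

# Simulated CLI output, since the field names are assumed:
text, stats = execute_with_stats('{"response": "ok", "stats": {"tokens": 42}}')
print(text, stats)
```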
### Streaming Output
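With `output_format="stream-json"`, output arrives incrementally. A sketch that parses one JSON object per line, fed here from a simulated stream (real input would be the CLI's stdout; the `"chunk"` key is an assumption):

```python
import io
import json

def read_stream(stream) -> list:
    """Parse stream-json output: one JSON object per non-empty line."""
    events = []
    for line in stream:
        line = line.strip()
        if line:
            events.append(json.loads(line))
    return events

# Simulated stream standing in for the CLI's stdout:
fake = io.StringIO('{"chunk": "Hello"}\n{"chunk": " world"}\n')
print("".join(event["chunk"] for event in read_stream(fake)))  # Hello world
```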
### As Agent Tool
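PraisonAI agents can call plain Python functions as tools. A hedged sketch of wrapping a Gemini CLI call as one (the `Agent` import and exact registration signature may differ in your installed version, so it is shown commented out):

```python
import shutil
import subprocess

def gemini_code_review(path: str) -> str:
    """Review the code at `path` using the Gemini CLI.

    The agent framework can read this docstring to decide when to call the tool.
    """
    cmd = ["gemini", "-p", f"Review the code in {path}", "--output-format", "json"]
    if not shutil.which("gemini"):
        return "gemini CLI not installed"
    return subprocess.run(cmd, capture_output=True, text=True, timeout=300).stdout

# Hypothetical registration -- adjust to your PraisonAI version:
# from praisonaiagents import Agent
# agent = Agent(instructions="You are a code reviewer", tools=[gemini_code_review])
```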
## Environment Variables
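The variable list is missing here. Assuming the Gemini CLI's usual credential variable, configuration would look like:

```shell
# Assumed: the Gemini CLI reads the API key from GEMINI_API_KEY
export GEMINI_API_KEY="your-api-key"
```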
## CLI Flags Used
The integration uses the following Gemini CLI flags:

| Flag | Description |
|---|---|
| `-p` | Print mode (headless) |
| `-m` | Model selection |
| `--output-format json` | JSON output for parsing |
| `--include-directories` | Include additional directories |
| `--sandbox` | Run in sandbox mode |
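Putting the flags in the table together, a fully specified invocation would look like:

```shell
gemini -p "Summarize this repository" \
  -m gemini-2.5-flash \
  --output-format json \
  --include-directories ../shared \
  --sandbox
```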
## JSON Output Schema
The JSON output includes the model's response along with usage metadata.

## Error Handling
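A sketch of handling the common failure modes when shelling out to the CLI (missing binary, timeout, non-zero exit), using standard `subprocess` exceptions:

```python
import subprocess

def safe_execute(cmd: list, timeout: int = 300) -> str:
    """Run the CLI, turning common failure modes into readable errors."""
    try:
        result = subprocess.run(
            cmd, capture_output=True, text=True, timeout=timeout, check=True
        )
        return result.stdout
    except FileNotFoundError:
        return "error: gemini CLI not found on PATH"
    except subprocess.TimeoutExpired:
        return f"error: timed out after {timeout}s"
    except subprocess.CalledProcessError as exc:
        return f"error: exit code {exc.returncode}: {exc.stderr}"

print(safe_execute(["gemini", "-p", "hi", "--output-format", "json"]))
```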
## Best Practices
- Use `gemini-2.5-flash` for quick tasks
- Use `gemini-2.5-pro` for complex analysis
- Include relevant directories for better context
- Use `execute_with_stats()` to monitor usage
- Set appropriate timeouts for large codebases

