# Documentation Index

Fetch the complete documentation index at: https://docs.praison.ai/llms.txt

Use this file to discover all available pages before exploring further.
## Different User Interfaces
| Interface | Description | URL |
|---|---|---|
| UI | Multi-Agent Systems Interface | https://docs.praison.ai/ui/ui |
| Chat | Chat with 100+ LLMs, single AI Agent | https://docs.praison.ai/ui/chat |
| Code | Chat with entire Codebase, single AI Agent | https://docs.praison.ai/ui/code |
## Install PraisonAI Code
### AICoder Component Dependencies

The AICoder UI component requires all of its dependencies to be installed. If any of them is missing, importing the AICoder component raises a `ModuleNotFoundError`, since all imports in the module are unconditional.

- A username and password will be asked for the first time; `admin` is the default username and password.
- Set the model name to `gpt-4o-mini` in the settings.
## Other Models

- Use 100+ LLMs via LiteLLM
- Includes Gemini 1.5 with a 2 million token context length

### To Use Gemini 1.5

```bash
export GEMINI_API_KEY=xxxxxxxxx
praisonai code
```

- Set the model name to `gemini/gemini-1.5-flash` in the settings.
## Ignore Files

### Using .praisonignore

- Create a `.praisonignore` file in the root folder of the project
- Add the files to ignore
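For example, a `.praisonignore` might look like this (the entries below are illustrative; list whatever paths your project should hide from the agent):

```
.env
.git
node_modules
__pycache__
*.pyc
```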
### Using settings.yaml

(`.praisonignore` is preferred)

- Create a `settings.yaml` file in the root folder of the project
- Add the variables below with the required ignore files
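A sketch of the corresponding `settings.yaml` (the `code.ignore_files` key shape is an assumption; confirm the exact key names against your PraisonAI version):

```yaml
code:
  ignore_files:
    - ".*"
    - "*.pyc"
    - "__pycache__"
    - ".git"
```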
### Using a .env File

- Create a `.env` file in the root folder of the project
- Add the variables below with the required ignore files
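A sketch of the `.env` entry (the `PRAISONAI_IGNORE_FILES` variable name is an assumption; confirm it against your PraisonAI version):

```
PRAISONAI_IGNORE_FILES=".*,*.pyc,__pycache__,.git"
```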
### Using Environment Variables in the Terminal
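The same setting can be exported in the shell before launching the UI (again, `PRAISONAI_IGNORE_FILES` is an assumed variable name; confirm it against your version):

```shell
# Assumed variable name; set it before launching `praisonai code`.
export PRAISONAI_IGNORE_FILES=".*,*.pyc,__pycache__,.git"
```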
## Include Files: .praisoninclude

- Add the files you wish to include in the context
- This adds the files/folders mentioned in `.praisoninclude` to the original context (the files in the folder, minus those excluded by `.gitignore` and `.praisonignore`)
- Create a `.praisoninclude` file in the root folder of the project
- Add the files to include
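For example, a `.praisoninclude` might look like this (the paths below are illustrative):

```
src/
docs/architecture.md
```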
## Include ONLY These Files: .praisoncontext (Context)

- Add the files you wish to include in the context
- This includes ONLY the files/folders mentioned in `.praisoncontext` in the context
- Create a `.praisoncontext` file in the root folder of the project
- Add the files to include
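For example, a `.praisoncontext` might look like this (the paths below are illustrative); only these files become the context:

```
src/main.py
src/utils.py
```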
## Set Max Tokens

Note: by default, Max Tokens is set to 900,000.

- Create a `.env` file in the root folder of the project
- Add the variable below with the required max tokens
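A sketch of the `.env` entry (the `PRAISONAI_MAX_TOKENS` variable name is an assumption; confirm it against your PraisonAI version):

```
PRAISONAI_MAX_TOKENS=900000
```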
## Default DB Location

`~/.praisonai/database.sqlite`
## Key Features

### Internet Search

PraisonAI Code includes internet search capabilities using Crawl4AI and Tavily, so you can retrieve up-to-date information and code snippets during your coding sessions. To use this feature:

- Ask a question or request information about a specific coding topic
- The AI will use internet search to find the most relevant and current information
- You'll receive code snippets, documentation references, or explanations based on the latest available resources
### Vision Language Model (VLM) Support

While primarily designed for code interactions, PraisonAI Code also supports Vision Language Model capabilities. This is particularly useful for visual aspects of programming, such as UI design, data visualization, or understanding code structure through diagrams. To use this feature:

- Upload an image related to your coding query (e.g., a screenshot of a UI, a flowchart, or an image of a code snippet)
- Ask questions or request analysis based on the uploaded image
- The VLM will process the image and provide insights based on its visual content, helping you understand or implement the visual concepts in your code
## External Agents

The PraisonAI Code interface includes sidebar toggles for external AI coding CLIs. These toggles appear automatically when the corresponding CLI tools are installed and available on your system PATH.

### Available External Agents

- Claude Code Toggle: enable the Claude Code CLI integration for advanced file editing and code analysis
- Gemini CLI Toggle: enable the Google Gemini CLI for code analysis and search capabilities
- Codex CLI Toggle: enable the OpenAI Codex CLI for code refactoring and optimization
- Cursor CLI Toggle: enable the Cursor CLI for IDE-style development tasks
### How It Works

- Auto-Detection: toggles only appear for CLIs that are installed and discoverable via `shutil.which()`
- Sidebar Integration: toggle switches appear in the Code UI sidebar settings
- Persistence: toggle states persist across sessions via Chainlit settings
- Dynamic Instructions: enabled agents modify the assistant's system instructions
- Workspace Context: external agents operate against `PRAISONAI_CODE_REPO_PATH`
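The auto-detection step can be sketched in a few lines of Python. This is an illustration of the mechanism, not PraisonAI's actual code, and the binary names below are assumptions:

```python
import shutil

# Hypothetical mapping of settings keys to CLI binary names;
# the binary names are assumptions, not confirmed PraisonAI values.
EXTERNAL_AGENTS = {
    "claude_enabled": "claude",
    "gemini_enabled": "gemini",
    "codex_enabled": "codex",
    "cursor_enabled": "cursor-agent",
}

def detect_available_agents() -> list[str]:
    """Return the settings keys whose CLI binary is found on PATH."""
    return [
        key
        for key, binary in EXTERNAL_AGENTS.items()
        if shutil.which(binary) is not None  # None when not installed
    ]
```

A toggle would then be rendered in the sidebar only for the keys this function returns.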
### Backward Compatibility

The new toggles replace the previous single "Enable Claude Code" switch. Legacy settings are automatically migrated:

- `claude_code_enabled` setting → `claude_enabled`
- `PRAISONAI_CLAUDECODE_ENABLED` environment variable → `claude_enabled`
For complete documentation on external agents UI toggles across all PraisonAI interfaces, see External Agents in UI.
## Local Docker Development with Live Reload

To facilitate local development with live reload, you can use Docker:

1. Create a `Dockerfile.dev`
2. Create a `docker-compose.yml`
3. Run Docker Compose
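A minimal sketch of the two files, assuming a pip-installable `praisonai[code]` package; the base image, port number, and volume paths are assumptions to adapt to your project:

```dockerfile
# Dockerfile.dev — development image (package name and port are assumptions)
FROM python:3.11-slim
WORKDIR /app
RUN pip install --no-cache-dir "praisonai[code]"
EXPOSE 8081
CMD ["praisonai", "code"]
```

```yaml
# docker-compose.yml — mounts the working tree for live reload
services:
  praisonai:
    build:
      context: .
      dockerfile: Dockerfile.dev
    ports:
      - "8081:8081"
    volumes:
      - .:/app   # edits on the host are visible inside the container
    environment:
      - OPENAI_API_KEY=${OPENAI_API_KEY}
```

Then start everything with `docker compose up --build`; because the source tree is bind-mounted into the container, changes on the host are picked up without rebuilding the image.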

