
# Ollama

> Use local Ollama models with PraisonAI Agents

## Models

Run models locally with Ollama. Popular options:

* **Recommended**: `ollama/llama3.2` (latest Llama)
* **Reasoning**: `ollama/deepseek-r1` (reasoning model)
* **Small**: `ollama/qwen3` (efficient)
* **Code**: `ollama/codellama` (coding tasks)

## Setup

```bash
# Install Ollama
curl -fsSL https://ollama.com/install.sh | sh

# Start Ollama server
ollama serve

# Pull a model
ollama pull llama3.2
```
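Before wiring Ollama into an agent, it can help to confirm the server is actually reachable. Ollama exposes a small REST API on port 11434; `GET /api/tags` returns the models you have pulled. A minimal standard-library check (the endpoint and response shape follow Ollama's API; the helper name is just for illustration):

```python
# Quick health check for a local Ollama server (default port 11434).
# Standard library only; prints the pulled models if the server is up.
import json
from urllib import request, error

OLLAMA_URL = "http://localhost:11434/api/tags"  # Ollama's model-list endpoint

def list_local_models(url: str = OLLAMA_URL) -> list[str]:
    """Return the names of locally pulled models, or [] if the server is down."""
    try:
        with request.urlopen(url, timeout=2) as resp:
            data = json.load(resp)
        return [m["name"] for m in data.get("models", [])]
    except (error.URLError, OSError):
        return []

models = list_local_models()
if models:
    print("Ollama is running; pulled models:", ", ".join(models))
else:
    print("Ollama not reachable; start it with `ollama serve`")
```

If the list is empty or the server is unreachable, run `ollama serve` and `ollama pull llama3.2` from the Setup section first.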

## Python

```python
# No API key needed - runs locally
from praisonaiagents import Agent

agent = Agent(
    instructions="You are a helpful assistant",
    llm="ollama/llama3.2"
)
agent.start("Explain deep learning")
```
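By default, `ollama/...` model strings are resolved against the local server at `http://localhost:11434`. If your Ollama server runs on another machine, one way to redirect requests is to set `OLLAMA_API_BASE` before creating the agent; this sketch assumes PraisonAI routes Ollama models through LiteLLM, which reads that variable (the hostname below is hypothetical):

```python
# Assumption: "ollama/..." models are routed through LiteLLM, which honors
# OLLAMA_API_BASE when the server is not on localhost:11434.
import os

os.environ["OLLAMA_API_BASE"] = "http://gpu-box.local:11434"  # hypothetical remote host

# The agent is then created exactly as before:
# from praisonaiagents import Agent
# agent = Agent(instructions="You are a helpful assistant", llm="ollama/llama3.2")
print(os.environ["OLLAMA_API_BASE"])
```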

### With Tools

```python
from praisonaiagents import Agent

def read_file(path: str) -> str:
    """Read a file's contents."""
    with open(path, 'r') as f:
        return f.read()

agent = Agent(
    instructions="You are a code assistant",
    llm="ollama/codellama",
    tools=[read_file]
)
agent.start("Read and explain main.py")
```

### Multi-Agent

```python
from praisonaiagents import Agent, Task, AgentTeam

researcher = Agent(
    instructions="You research topics thoroughly",
    llm="ollama/llama3.2"
)
writer = Agent(
    instructions="You write clear summaries",
    llm="ollama/qwen3"
)

task1 = Task(description="Research Python best practices", agent=researcher)
task2 = Task(description="Write a guide", agent=writer)

agents = AgentTeam(agents=[researcher, writer], tasks=[task1, task2])
agents.start()
```

### DeepSeek Reasoning

```python
from praisonaiagents import Agent

agent = Agent(
    instructions="You are a problem solver",
    llm="ollama/deepseek-r1"
)
agent.start("Solve this math problem: What is 15% of 240?")
```

## CLI

```bash
# Basic prompt with an Ollama model
python -m praisonai "Explain AI" --ollama llama3.2

# With specific model
python -m praisonai "Write code" --llm ollama/codellama

# Run the agents.yaml in the current directory
python -m praisonai
```

## YAML

```yaml
framework: praisonai
topic: Local AI development
agents:
  coder:
    role: Software Developer
    goal: Write clean code
    instructions: You are an expert programmer
    llm:
      model: ollama/codellama
    tasks:
      code_task:
        description: Write a Python function to sort a list
        expected_output: Clean, documented Python code

  reviewer:
    role: Code Reviewer
    goal: Review and improve code
    instructions: You review code for best practices
    llm:
      model: ollama/llama3.2
    tasks:
      review_task:
        description: Review the code and suggest improvements
        expected_output: Code review with suggestions
```


Built with [Mintlify](https://mintlify.com).