# Ollama Provider
Run models locally with Ollama.

## Environment Variables

Ollama serves its API on `http://localhost:11434` by default; set `OLLAMA_HOST` to point the client at a different host or port.
## Supported Modalities
| Modality | Supported |
|---|---|
| Text/Chat | ✅ |
| Embeddings | ✅ |
| Tools | ✅ |
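
The embeddings capability listed above maps onto Ollama's documented `/api/embeddings` endpoint. The sketch below is an illustration against that endpoint, not this provider's own client API; the `embed` helper name is hypothetical, and a dedicated embedding model such as `nomic-embed-text` is often used instead of a chat model.

```python
# Hedged sketch: calls Ollama's REST /api/embeddings endpoint directly.
import requests

OLLAMA_URL = "http://localhost:11434"  # Ollama's default listen address


def embed(text: str, model: str = "llama3.2") -> list[float]:
    """Return an embedding vector for `text` using a locally pulled model."""
    resp = requests.post(
        f"{OLLAMA_URL}/api/embeddings",
        json={"model": model, "prompt": text},
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()["embedding"]


if __name__ == "__main__":
    vector = embed("Ollama runs models locally.")
    print(len(vector))  # dimensionality depends on the model
```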
## Prerequisites
- Install Ollama: https://ollama.ai
- Pull a model (a quick way to verify the pull is sketched after this list):

  ```bash
  ollama pull llama3.2
  ```
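
To confirm the pull finished, Ollama's `/api/tags` endpoint lists every locally available model. This is a minimal sketch against that endpoint, independent of the provider itself:

```python
# Hedged sketch: list locally available models via Ollama's /api/tags endpoint.
import requests

resp = requests.get("http://localhost:11434/api/tags", timeout=10)
resp.raise_for_status()
names = [m["name"] for m in resp.json()["models"]]
print(names)  # e.g. ['llama3.2:latest'] once the pull has finished
```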
## Quick Start
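
The provider's own client API is not shown in this section, so the sketch below talks to Ollama's `/api/chat` endpoint directly, which every Ollama install exposes. It assumes `llama3.2` has already been pulled.

```python
# Hedged sketch: a minimal chat completion against Ollama's local REST API.
import requests

resp = requests.post(
    "http://localhost:11434/api/chat",
    json={
        "model": "llama3.2",
        "messages": [{"role": "user", "content": "Why is the sky blue?"}],
        "stream": False,  # return one JSON object instead of a token stream
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["message"]["content"])
```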
## Available Models
| Model | Description |
|---|---|
| `llama3.2` | Llama 3.2 |
| `llama3.1` | Llama 3.1 |
| `mistral` | Mistral 7B |
| `codellama` | Code Llama |
| `phi3` | Phi-3 |
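
Tool calling (marked as supported above) depends on the model: models such as `llama3.1` accept an OpenAI-style tool schema through Ollama's `/api/chat` endpoint. A hedged sketch with a hypothetical `get_weather` tool, again using the REST API rather than this provider's client:

```python
# Hedged sketch: tool calling via Ollama's /api/chat with a tool-capable model.
import requests

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",  # hypothetical tool, for illustration only
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

resp = requests.post(
    "http://localhost:11434/api/chat",
    json={
        "model": "llama3.1",
        "messages": [{"role": "user", "content": "What's the weather in Paris?"}],
        "tools": tools,
        "stream": False,
    },
    timeout=120,
)
resp.raise_for_status()
message = resp.json()["message"]
# If the model decided to call the tool, the calls show up here:
for call in message.get("tool_calls", []):
    print(call["function"]["name"], call["function"]["arguments"])
```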

