Overview
Ollama allows you to run embedding models locally on your machine with no API costs.

Quick Start
CLI Usage
Setup
- Install Ollama: https://ollama.ai
- Pull an embedding model, e.g. `ollama pull nomic-embed-text` (see the sketch after this list)
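
Once a model is pulled, embeddings can be requested from the local Ollama server. The following is a minimal sketch in Python, assuming the server is running on its default port (11434) and exposes the `/api/embeddings` endpoint; when calling Ollama directly like this, the model name is used without the `ollama/` prefix shown in the table below.

```python
import requests

# Assumes a local Ollama server on the default port (11434) and that
# `ollama pull nomic-embed-text` has already been run.
OLLAMA_URL = "http://localhost:11434/api/embeddings"

def embed(text: str, model: str = "nomic-embed-text") -> list[float]:
    """Request an embedding vector for `text` from the local Ollama server."""
    response = requests.post(OLLAMA_URL, json={"model": model, "prompt": text})
    response.raise_for_status()
    return response.json()["embedding"]

if __name__ == "__main__":
    vector = embed("Ollama runs embedding models locally.")
    print(len(vector))  # expected: 768 for nomic-embed-text
```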
Available Models
| Model | Dimensions | Size |
|---|---|---|
| ollama/nomic-embed-text | 768 | 274MB |
| ollama/mxbai-embed-large | 1024 | 669MB |
| ollama/all-minilm | 384 | 45MB |
| ollama/snowflake-arctic-embed | 1024 | 669MB |
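
As a quick sanity check, the dimensions listed above can be verified against a running server. This is an illustrative sketch only, assuming all four models have already been pulled and the Ollama server is listening on `localhost:11434`; model names are again passed without the `ollama/` prefix.

```python
import requests

# Expected vector sizes, taken from the table above.
EXPECTED_DIMENSIONS = {
    "nomic-embed-text": 768,
    "mxbai-embed-large": 1024,
    "all-minilm": 384,
    "snowflake-arctic-embed": 1024,
}

for model, expected in EXPECTED_DIMENSIONS.items():
    resp = requests.post(
        "http://localhost:11434/api/embeddings",
        json={"model": model, "prompt": "dimension check"},
    )
    resp.raise_for_status()
    actual = len(resp.json()["embedding"])
    print(f"{model}: {actual} dimensions (expected {expected})")
```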

