Ollama MCP Integration
Guide for integrating Ollama models with PraisonAI agents using MCP
Add Ollama Tool to AI Agent
Quick Start
1. Set Up Ollama
Make sure you have Ollama installed and running locally:
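For example, you can start the local server with `ollama serve` (if it is not already running as a service) and pull a model for the agent to use, e.g. `ollama pull llama3.2` — the model name here is only an example; any model available in your local Ollama installation works.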
2. Create a File
Create a new file named ollama_airbnb.py with the following code:
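A minimal sketch of what this file could contain is shown below. It assumes the `praisonaiagents` package provides the `Agent` and `MCP` classes used in the other PraisonAI MCP guides, and that the community `@openbnb/mcp-server-airbnb` package serves as the Airbnb MCP server; the model name and search query are only examples.

```python
from praisonaiagents import Agent, MCP

# Agent backed by a local Ollama model; "ollama/llama3.2" is an example model name.
search_agent = Agent(
    instructions="You help users search for and book apartments on Airbnb.",
    llm="ollama/llama3.2",
    # The Airbnb MCP server is launched with npx, which is why Node.js is required.
    tools=MCP("npx -y @openbnb/mcp-server-airbnb"),
)

# Example natural-language query; adjust location, dates, and guests as needed.
search_agent.start(
    "Search for apartments in Paris for 2 adults for 2 nights. "
    "Include a direct URL for each listing."
)
```

Because the `llm` string uses the `ollama/` prefix, inference is routed to your local Ollama server rather than an external API.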
3. Install Dependencies
Make sure you have Node.js installed, as the MCP server requires it:
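Assuming the package names used in the snippet above, the Python side can be installed with something like `pip install praisonaiagents mcp`. Node.js is needed because the Airbnb MCP server is started through `npx`; you can verify it is available with `node --version`.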
4. Run the Agent
Execute your script:
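For example, from the directory containing the file: `python ollama_airbnb.py`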
Requirements
- Python 3.10 or higher
- Node.js installed on your system
- Ollama installed and running locally
Features
Local LLM
Run models locally using Ollama without relying on external APIs.
MCP Integration
Seamless integration with the Model Context Protocol (MCP).
Airbnb Search
Search for accommodations on Airbnb with natural language queries.
Privacy-Focused
Keep sensitive data local with on-device inference.