Creating MCP Servers

This guide demonstrates how to create Model Context Protocol (MCP) servers using PraisonAI agents. MCP is a protocol that enables AI models to use tools and communicate with external systems in a standardized way.

Single-Agent MCP Server

The simplest way to create an MCP server is with a single agent. This approach is ideal for specialized tasks where you need just one agent with a specific capability.

1. Install Dependencies

Make sure you have the required packages installed:

pip install "praisonaiagents[mcp]>=0.0.81"

2. Create a Simple MCP Server

Create a file named simple-mcp-server.py with the following code:

from praisonaiagents import Agent

agent = Agent(instructions="Create a Tweet based on the topic provided")
agent.launch(port=8080, protocol="mcp")

3. Run the Server

python simple-mcp-server.py

Your MCP server will be available at http://localhost:8080

Multi-Agent MCP Server with Custom Tools

For more complex scenarios, you can create an MCP server with multiple agents and custom tools. This approach allows for collaborative problem-solving and specialized capabilities.

1. Install Additional Dependencies

pip install "praisonaiagents[mcp]" duckduckgo-search

2. Create a Multi-Agent MCP Server

Create a file named simple-mcp-multi-agents-server.py with the following code:

from praisonaiagents import Agent, Agents
from duckduckgo_search import DDGS

def internet_search_tool(query: str) -> list:
    """Search the web with DuckDuckGo and return title, URL, and snippet for the top results."""
    results = []
    ddgs = DDGS()
    for result in ddgs.text(keywords=query, max_results=5):
        results.append({
            "title": result.get("title", ""),
            "url": result.get("href", ""),
            "snippet": result.get("body", "")
        })
    return results

agent = Agent(instructions="You search the internet for information", tools=[internet_search_tool])
agent2 = Agent(instructions="You summarise the information")

agents = Agents(agents=[agent, agent2])
agents.launch(port=8080, protocol="mcp")

3. Run the Multi-Agent Server

python simple-mcp-multi-agents-server.py

Your multi-agent MCP server will be available at http://localhost:8080

Multi-Agent MCP Server (Simple)

For scenarios where you need multiple agents to collaborate without custom tools, you can create a simpler multi-agent MCP server:

from praisonaiagents import Agent, Agents

agent = Agent(instructions="You search the internet for information")
agent2 = Agent(instructions="You summarise the information")

agents = Agents(agents=[agent, agent2])
agents.launch(port=8080, protocol="mcp")

This approach is ideal for cases where you want agents with different specializations to work together using their built-in capabilities.

Connecting to MCP Servers

You can connect to MCP servers using various clients:

Using PraisonAI Agents

from praisonaiagents import Agent, MCP

client_agent = Agent(
    instructions="Use the MCP server to complete tasks",
    llm="gpt-4o-mini",
    tools=MCP("http://localhost:8080")
)

response = client_agent.start("Create a tweet about artificial intelligence")
print(response)

Using JavaScript/TypeScript

With the official MCP TypeScript SDK (@modelcontextprotocol/sdk), a client connects over a transport and calls the tools the server exposes. The SSE endpoint path, tool name, and argument schema below are assumptions that depend on how the PraisonAI server registers its agent; list the server's tools first to discover the real names.

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { SSEClientTransport } from "@modelcontextprotocol/sdk/client/sse.js";

async function main() {
  // Assumes the server exposes an SSE endpoint at /sse; adjust to match your server.
  const transport = new SSEClientTransport(new URL("http://localhost:8080/sse"));
  const client = new Client({ name: "example-client", version: "1.0.0" });
  await client.connect(transport);

  // Discover which tools the PraisonAI agent exposes.
  const { tools } = await client.listTools();
  console.log(tools.map((tool) => tool.name));

  // Call the first tool; the argument name is an assumption and must match the tool's schema.
  const result = await client.callTool({
    name: tools[0].name,
    arguments: { query: "Create a tweet about artificial intelligence" },
  });
  console.log(result);
}

main().catch(console.error);

Advanced Configuration

Custom Port and Host

agent.launch(port=9000, host="0.0.0.0", protocol="mcp")

Authentication

agent.launch(port=8080, protocol="mcp", api_key="your-secret-key")

CORS Configuration

agent.launch(port=8080, protocol="mcp", cors_origins=["https://yourdomain.com"])
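
These options can be combined in a single call. The sketch below uses only the parameter names shown above and assumes your installed version of praisonaiagents supports all of them together:

agent.launch(
    port=9000,
    host="0.0.0.0",
    protocol="mcp",
    api_key="your-secret-key",
    cors_origins=["https://yourdomain.com"],
)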

Deployment Options

For production deployments, consider:

  1. Docker Containerization (a sample requirements.txt follows this list):

    FROM python:3.11-slim
    
    WORKDIR /app
    
    COPY requirements.txt .
    RUN pip install --no-cache-dir -r requirements.txt
    
    COPY . .
    
    EXPOSE 8080
    
    CMD ["python", "simple-mcp-server.py"]
    
  2. Cloud Deployment: Deploy to AWS, Google Cloud, or Azure using their container services.

  3. Kubernetes: For scalable deployments, use Kubernetes to manage your MCP server containers.
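
The Dockerfile in step 1 copies a requirements.txt into the image. A minimal one, based on the install commands earlier in this guide, could look like this (duckduckgo-search is only needed for the custom search tool; pin exact versions for reproducible builds):

praisonaiagents[mcp]>=0.0.81
duckduckgo-search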

Security Considerations

  1. API Authentication: Always use API keys in production
  2. Rate Limiting: Implement rate limiting to prevent abuse
  3. Input Validation: Validate all incoming requests (see the sketch after this list)
  4. HTTPS: Use SSL/TLS for all production deployments
  5. Tool Permissions: Limit what custom tools can access
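
As an illustration of input validation inside a custom tool, the search tool from earlier can reject empty or oversized queries before they reach the search backend. This is a minimal sketch; the length limit and error format are assumptions chosen for the example, not part of PraisonAI:

from duckduckgo_search import DDGS

MAX_QUERY_LENGTH = 200  # assumed limit for this example

def internet_search_tool(query: str) -> list:
    """Search the web, validating the query before it reaches the search backend."""
    # Reject invalid input instead of passing it straight through to the tool.
    if not isinstance(query, str) or not query.strip():
        return [{"error": "Query must be a non-empty string"}]
    if len(query) > MAX_QUERY_LENGTH:
        return [{"error": f"Query must be at most {MAX_QUERY_LENGTH} characters"}]

    results = []
    for result in DDGS().text(keywords=query.strip(), max_results=5):
        results.append({
            "title": result.get("title", ""),
            "url": result.get("href", ""),
            "snippet": result.get("body", ""),
        })
    return results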

Features and Benefits

Standardized Protocol

MCP provides a standardized way for AI models to interact with tools and services.

Custom Tools

Easily integrate custom tools like web search, database access, or API calls.

Multi-Agent Collaboration

Create systems where multiple specialized agents collaborate on complex tasks.

Language Agnostic

Connect to MCP servers from any programming language that supports HTTP.

Best Practices

  1. Agent Instructions: Provide clear, specific instructions for each agent
  2. Tool Documentation: Document your custom tools thoroughly
  3. Error Handling: Implement robust error handling in your tools
  4. Monitoring: Set up logging and monitoring for your MCP servers (a simple logging sketch follows this list)
  5. Testing: Test your MCP servers thoroughly before deployment
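
For basic monitoring, standard Python logging around your custom tools gives you call-level visibility without extra dependencies. A minimal sketch (the logger name and wrapper are arbitrary choices for this example):

import logging
import time
from functools import wraps

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")
logger = logging.getLogger("mcp-server")

def logged_tool(tool):
    """Wrap a tool function so each call, its duration, and any failure are logged."""
    @wraps(tool)
    def wrapper(*args, **kwargs):
        start = time.time()
        try:
            result = tool(*args, **kwargs)
            logger.info("%s succeeded in %.2fs", tool.__name__, time.time() - start)
            return result
        except Exception:
            logger.exception("%s failed after %.2fs", tool.__name__, time.time() - start)
            raise
    return wrapper

# Example usage: wrap the tool before handing it to an agent.
# agent = Agent(instructions="You search the internet for information",
#               tools=[logged_tool(internet_search_tool)])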
