## Documentation Index

Fetch the complete documentation index at https://docs.praison.ai/llms.txt and use it to discover all available pages before exploring further.

Deploy single or multi-agent systems as MCP (Model Context Protocol) servers for Claude Desktop, Cursor, and other MCP clients.
## Quick Start

### Install Dependencies

```bash
pip install "praisonaiagents[mcp]"
```
### Set API Key

```bash
export OPENAI_API_KEY="your-key"
```
### Create Agent MCP Server

```python
from praisonaiagents import Agent

agent = Agent(
    name="TweetAgent",
    instructions="Create engaging tweets",
    llm="gpt-4o-mini"
)
agent.launch(port=8080, protocol="mcp")
```
### Verify

```bash
curl http://localhost:8080/sse
```
## Single Agent as MCP

```python
from praisonaiagents import Agent

agent = Agent(
    name="TweetAgent",
    instructions="Create engaging tweets",
    llm="gpt-4o-mini"
)
agent.launch(port=8080, protocol="mcp")
```
Expected output:

```text
🚀 Agent 'TweetAgent' MCP server starting on http://0.0.0.0:8080
📡 MCP SSE endpoint available at /sse
📢 MCP messages post to /messages/
🛠️ Available MCP tools: execute_tweetagent_task
```
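The tool name in the startup log appears to follow a convention derived from the agent's name: lowercased and wrapped in `execute_..._task`. A minimal sketch of that assumed convention, handy when building tool calls programmatically; `mcp_tool_name` is a hypothetical helper, not part of the library, and the pattern applies to single-agent servers (multi-agent servers expose `execute_workflow` instead):

```python
def mcp_tool_name(agent_name: str) -> str:
    """Assumed convention from the startup log:
    execute_<lowercased agent name>_task."""
    return f"execute_{agent_name.lower()}_task"

print(mcp_tool_name("TweetAgent"))  # execute_tweetagent_task
```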
## Multi-Agent as MCP

```python
from praisonaiagents import Agent, AgentTeam

researcher = Agent(name="Researcher", instructions="Research topics", llm="gpt-4o-mini")
writer = Agent(name="Writer", instructions="Write content", llm="gpt-4o-mini")

agents = AgentTeam(agents=[researcher, writer])
agents.launch(port=8080, protocol="mcp")
```
Expected output:

```text
🚀 Agents MCP Workflow server starting on http://0.0.0.0:8080
📡 MCP SSE endpoint available at /sse
📢 MCP messages post to /messages/
🛠️ Available MCP tools: execute_workflow
🔄 Agents in MCP workflow: Researcher, Writer
```
## MCP Endpoints

| Endpoint | Method | Description |
|---|---|---|
| `/sse` | GET | SSE connection for MCP |
| `/messages/` | POST | Send MCP messages |
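An MCP client keeps the SSE connection on `/sse` open and POSTs JSON-RPC 2.0 messages to `/messages/`. A stdlib sketch of building such a message body (the helper name `jsonrpc_envelope` is illustrative, not part of the library):

```python
import json

def jsonrpc_envelope(method, params=None, req_id=1):
    """Build a JSON-RPC 2.0 message body for POSTing to /messages/."""
    msg = {"jsonrpc": "2.0", "method": method, "id": req_id}
    if params is not None:
        msg["params"] = params
    return json.dumps(msg)

print(jsonrpc_envelope("tools/list"))
# {"jsonrpc": "2.0", "method": "tools/list", "id": 1}
```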
## `launch()` Parameters

| Parameter | Type | Default | Description |
|---|---|---|---|
| `port` | int | `8000` | Server port |
| `host` | str | `"0.0.0.0"` | Server host |
| `protocol` | str | `"mcp"` | Must be `"mcp"` |
| `debug` | bool | `False` | Debug mode |
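The defaults above can be overridden per deployment. One common pattern is to read them from environment variables; this is a sketch, not part of the library, and the `MCP_*` variable names are assumptions:

```python
import os

def launch_kwargs_from_env():
    """Collect launch() keyword arguments, falling back to the
    documented defaults when no environment override is set."""
    return {
        "port": int(os.getenv("MCP_PORT", "8000")),
        "host": os.getenv("MCP_HOST", "0.0.0.0"),
        "protocol": "mcp",  # must always be "mcp"
        "debug": os.getenv("MCP_DEBUG", "0") == "1",
    }

# agent.launch(**launch_kwargs_from_env())
```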
## Connect MCP Client

Claude Desktop (`claude_desktop_config.json`):

```json
{
  "mcpServers": {
    "praisonai-agent": {
      "url": "http://localhost:8080/sse"
    }
  }
}
```
Cursor (`.cursor/mcp.json`):

```json
{
  "mcpServers": {
    "praisonai-agent": {
      "url": "http://localhost:8080/sse"
    }
  }
}
```
Test the server directly with curl:

```bash
# List tools
curl -X POST http://localhost:8080/messages/ \
  -H "Content-Type: application/json" \
  -d '{"jsonrpc": "2.0", "method": "tools/list", "id": 1}'

# Call tool
curl -X POST http://localhost:8080/messages/ \
  -H "Content-Type: application/json" \
  -d '{
    "jsonrpc": "2.0",
    "method": "tools/call",
    "params": {"name": "execute_tweetagent_task", "arguments": {"task": "Create a tweet about AI"}},
    "id": 2
  }'
```
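The same calls can be made from Python with only the standard library. A hedged sketch: the `mcp_post` helper is illustrative (not part of the library) and assumes the Quick Start server is running on localhost:8080:

```python
import json
import urllib.request

def mcp_post(method, params=None, req_id=1,
             url="http://localhost:8080/messages/"):
    """Build the POST request equivalent to the curl commands above."""
    body = {"jsonrpc": "2.0", "method": method, "id": req_id}
    if params is not None:
        body["params"] = params
    return urllib.request.Request(
        url,
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = mcp_post("tools/call",
               {"name": "execute_tweetagent_task",
                "arguments": {"task": "Create a tweet about AI"}},
               req_id=2)
# To actually send it (requires a running server):
# with urllib.request.urlopen(req) as resp:
#     print(resp.read().decode())
```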
## Docker Deployment

`app.py`:

```python
from praisonaiagents import Agent

agent = Agent(instructions="Create tweets", llm="gpt-4o-mini")
agent.launch(port=8080, protocol="mcp")
```
`Dockerfile`:

```dockerfile
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY app.py .
EXPOSE 8080
CMD ["python", "app.py"]
```
`requirements.txt`:

```text
praisonaiagents[mcp]
```
Build and run:

```bash
docker build -t agent-mcp-server .
docker run -p 8080:8080 -e OPENAI_API_KEY=$OPENAI_API_KEY agent-mcp-server
```
## Troubleshooting

| Issue | Fix |
|---|---|
| Port in use | `lsof -i :8080` to identify the process holding the port, then free it or choose another port |
| Missing deps | `pip install "praisonaiagents[mcp]"` |
| No API key | `export OPENAI_API_KEY="your-key"` |
| SSE not connecting | Check firewall rules and bind to all interfaces with `host="0.0.0.0"` |
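For the "port in use" case, a quick stdlib check avoids guessing before calling `launch()`; `port_in_use` is an illustrative helper, not part of the library:

```python
import socket

def port_in_use(port, host="127.0.0.1"):
    """Return True if something is already listening on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(1)
        return s.connect_ex((host, port)) == 0

if port_in_use(8080):
    print("Port 8080 is taken; pass a different port to launch()")
```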