Wikipedia research agent with search, page retrieval, and summarization tools.

Simple

Agents: 1 — Single agent with Wikipedia tools handles search and content extraction.

Workflow

  1. Receive knowledge query
  2. Search Wikipedia articles
  3. Summarize findings

Setup

pip install praisonaiagents praisonai wikipedia
export OPENAI_API_KEY="your-key"

Run — Python

from praisonaiagents import Agent
from praisonaiagents.tools import wiki_search, wiki_summary, wiki_page

agent = Agent(
    name="WikiResearcher",
    instructions="Search and summarize Wikipedia content.",
    tools=[wiki_search, wiki_summary, wiki_page]
)

result = agent.start("What is the history of artificial intelligence?")
print(result)
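
The built-in wiki_* tools wrap the `wikipedia` package installed above. If you want to see roughly what they do, or your installed version lacks them, PraisonAI agents also accept plain Python functions as tools. The sketch below uses hypothetical function names and only standard `wikipedia` package calls (search, summary, page); treat it as an illustration, not the library's implementation.

# Hypothetical plain-function tools built on the `wikipedia` package;
# the bundled wiki_search / wiki_summary / wiki_page tools cover the same ground.
import wikipedia

def search_wikipedia(query: str, limit: int = 5) -> list[str]:
    """Return up to `limit` Wikipedia article titles matching the query."""
    return wikipedia.search(query, results=limit)

def summarize_article(title: str, sentences: int = 3) -> str:
    """Return a short plain-text summary of the named article."""
    return wikipedia.summary(title, sentences=sentences)

def get_article_text(title: str) -> str:
    """Return the full plain-text content of the named article."""
    return wikipedia.page(title).content

Passing these callables via tools=[search_wikipedia, summarize_article, get_article_text] should behave like the bundled tools, since PraisonAI tools are plain callables described by their docstrings.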

Run — CLI

praisonai "Explain quantum computing from Wikipedia" --tools wikipedia

Run — agents.yaml

framework: praisonai
topic: Wikipedia Research
roles:
  wiki_researcher:
    role: Wikipedia Research Specialist
    goal: Extract and summarize Wikipedia content
    backstory: You are an expert at finding knowledge
    tools:
      - wiki_search
      - wiki_summary
      - wiki_page
    tasks:
      research:
        description: What is the history of artificial intelligence?
        expected_output: A comprehensive summary
praisonai agents.yaml

Serve API

from praisonaiagents import Agent
from praisonaiagents.tools import wiki_search, wiki_summary, wiki_page

agent = Agent(
    name="WikiResearcher",
    instructions="You are a Wikipedia research agent.",
    tools=[wiki_search, wiki_summary, wiki_page]
)

agent.launch(port=8080)
curl -X POST http://localhost:8080/chat \
  -H "Content-Type: application/json" \
  -d '{"message": "Tell me about the Roman Empire"}'

Advanced Workflow (All Features)

Agents: 1 — Single agent with memory, persistence, structured output, and session resumability.

Workflow

  1. Initialize session for knowledge tracking
  2. Configure SQLite persistence for research history
  3. Search and extract with structured output
  4. Store findings in memory for follow-up queries
  5. Resume the session for continued research (see the resume sketch after the Python example)

Setup

pip install praisonaiagents praisonai wikipedia pydantic
export OPENAI_API_KEY="your-key"

Run — Python

from praisonaiagents import Agent, Task, Agents, Session
from praisonaiagents.tools import wiki_search, wiki_summary, wiki_page
from pydantic import BaseModel

class WikiKnowledge(BaseModel):
    topic: str
    summary: str
    key_facts: list[str]
    related_topics: list[str]

# The session ids can be reused in a later run to resume this research (see the resume sketch below)
session = Session(session_id="wiki-001", user_id="user-1")

agent = Agent(
    name="WikiResearcher",
    instructions="Extract structured knowledge from Wikipedia.",
    tools=[wiki_search, wiki_summary, wiki_page],
    memory=True
)

task = Task(
    description="What is the history of artificial intelligence?",
    expected_output="Structured knowledge summary",
    agent=agent,
    output_pydantic=WikiKnowledge
)

agents = Agents(
    agents=[agent],
    tasks=[task],
    memory=True
)

result = agents.start()
print(result)
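
Step 5 of the workflow, resuming the session, is not shown above. Below is a minimal resume sketch under the assumption that re-creating the Session with the same session_id and user_id, and keeping memory=True, lets follow-up queries draw on the findings stored in the first run.

# Resume sketch (assumed behavior): reuse the same session ids and memory
# so the agent can reference findings from the earlier run.
from praisonaiagents import Agent, Session
from praisonaiagents.tools import wiki_search, wiki_summary, wiki_page

session = Session(session_id="wiki-001", user_id="user-1")  # same ids as before

agent = Agent(
    name="WikiResearcher",
    instructions="Continue the earlier Wikipedia research.",
    tools=[wiki_search, wiki_summary, wiki_page],
    memory=True
)

followup = agent.start("Which early AI milestones did we already cover?")
print(followup)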

Run — CLI

praisonai "Explain AI history" --tools wikipedia --memory --verbose

Run — agents.yaml

framework: praisonai
topic: Wikipedia Research
memory: true
memory_config:
  provider: sqlite
  db_path: wiki.db
roles:
  wiki_researcher:
    role: Wikipedia Research Specialist
    goal: Extract structured knowledge
    backstory: You are an expert at finding knowledge
    tools:
      - wiki_search
      - wiki_summary
      - wiki_page
    memory: true
    tasks:
      research:
        description: What is the history of artificial intelligence?
        expected_output: Structured knowledge summary
        output_json:
          topic: string
          summary: string
          key_facts: array
          related_topics: array
praisonai agents.yaml --verbose

Serve API

from praisonaiagents import Agent
from praisonaiagents.tools import wiki_search, wiki_summary, wiki_page

agent = Agent(
    name="WikiResearcher",
    instructions="Extract structured knowledge from Wikipedia.",
    tools=[wiki_search, wiki_summary, wiki_page],
    memory=True
)

agent.launch(port=8080)
curl -X POST http://localhost:8080/chat \
  -H "Content-Type: application/json" \
  -d '{"message": "Tell me about Rome", "session_id": "wiki-001"}'

Monitor / Verify

praisonai "test wikipedia" --tools wikipedia --verbose

Cleanup

rm -f wiki.db

Features Demonstrated

Feature            Implementation
Workflow           Multi-tool Wikipedia research
DB Persistence     SQLite via memory_config
Observability      --verbose flag
Tools              wiki_search, wiki_summary, wiki_page
Resumability       Session with session_id
Structured Output  Pydantic WikiKnowledge model