Async Jobs CLI

Manage async jobs from the command line. Submit jobs, check status, get results, stream progress, and cancel jobs.

Commands Overview

Command                      Description
praisonai run submit         Submit a new job
praisonai run status <id>    Get job status
praisonai run result <id>    Get job result
praisonai run stream <id>    Stream job progress
praisonai run list           List all jobs
praisonai run cancel <id>    Cancel a job

Starting the Jobs Server

Before using job commands, start the jobs server:
# Start server on default port (8005)
python -m uvicorn praisonai.jobs.server:create_app --port 8005 --factory

# Or with custom host/port
python -m uvicorn praisonai.jobs.server:create_app --host 0.0.0.0 --port 8080 --factory
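
Once the server is listening, you can sanity-check the connection before submitting anything; any HTTP status code means the port is reachable. The root path below is only a reachability probe, not a documented health endpoint:
# Quick reachability check: any HTTP status code means the server is up; 000 means no connection
curl -s -o /dev/null -w 'HTTP %{http_code}\n' http://127.0.0.1:8005/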

Submit a Job

Submit a new job for execution:
# Basic submission with prompt
praisonai run submit "Analyze this data"

# Submit with recipe
praisonai run submit "Analyze AI trends" --recipe news-analyzer

# With recipe config
praisonai run submit "Analyze AI trends" --recipe news-analyzer --recipe-config '{"format": "json"}'

# With agent file
praisonai run submit "Analyze this data" --agent-file agents.yaml

# Wait for completion
praisonai run submit "Quick task" --wait

# Stream progress after submission
praisonai run submit "Long task" --stream

# With timeout
praisonai run submit "Complex task" --timeout 7200

# With webhook
praisonai run submit "Task" --webhook-url https://example.com/callback

# With idempotency
praisonai run submit "Task" --idempotency-key order-123 --idempotency-scope session

# With metadata
praisonai run submit "Task" --metadata user=john --metadata priority=high

# JSON output
praisonai run submit "Task" --json

# Custom API URL
praisonai run submit "Task" --api-url http://localhost:8080

Submit Options

Option               Description
--agent-file         Path to agents.yaml
--recipe             Recipe name (mutually exclusive with --agent-file)
--recipe-config      Recipe config as JSON string
--framework          Framework to use (default: praisonai)
--timeout            Timeout in seconds (default: 3600)
--wait               Wait for completion
--stream             Stream progress after submission
--idempotency-key    Key to prevent duplicates
--idempotency-scope  Scope: none, session, global
--webhook-url        Webhook URL for completion
--session-id         Session ID for grouping
--metadata           Custom metadata (KEY=VALUE, repeatable)
--json               Output JSON for scripting
--api-url            Jobs API URL (default: http://127.0.0.1:8005)
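
These options can be combined; here is a sketch of a fuller submission, where the recipe name, webhook URL, and metadata values are placeholders:
# Illustrative combined submission (recipe, webhook URL, and metadata are placeholders)
praisonai run submit "Summarize today's AI news" \
    --recipe news-analyzer \
    --recipe-config '{"format": "json"}' \
    --timeout 1800 \
    --webhook-url https://example.com/callback \
    --metadata user=john --metadata priority=high \
    --idempotency-key news-2024-01-15 --idempotency-scope session \
    --json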

Check Job Status

Get the current status of a job:
# Get status
praisonai run status run_abc123

# JSON output
praisonai run status run_abc123 --json

Status Output

Job: run_abc123
Status: running
Progress: 45%
Step: Processing data
Created: 2024-01-15 10:30:00
Duration: 45.2s

Get Job Result

Retrieve the result of a completed job:
# Get result
praisonai run result run_abc123

# JSON output
praisonai run result run_abc123 --json
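
In scripts, the --json output can be piped through jq; the .result field below is an assumption about the payload shape, so inspect the raw JSON from your server and adjust the filter:
# Extract the result payload (".result" is an assumed field name; check the actual --json output)
praisonai run result run_abc123 --json | jq '.result'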

Stream Job Progress

Stream real-time progress updates via SSE:
# Stream progress
praisonai run stream run_abc123

# Raw JSON events
praisonai run stream run_abc123 --json

Stream Output

[10%] Initializing agent
[20%] Loading recipe
[50%] Processing data
[90%] Finalizing
[100%] Completed
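
With --json, each event arrives as a raw JSON object, so the stream can be post-processed on the fly; the .progress and .step fields in this filter are assumptions about the event schema:
# Follow the raw event stream and print selected fields
# (".progress" and ".step" are assumed field names; adjust to the real events)
praisonai run stream run_abc123 --json | jq -r '[.progress, .step] | @tsv'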

List Jobs

List all jobs with optional filtering:
# List all jobs
praisonai run list

# Filter by status
praisonai run list --status running
praisonai run list --status succeeded
praisonai run list --status failed

# Pagination
praisonai run list --page 1 --page-size 20

# JSON output
praisonai run list --json

List Options

Option       Description
--status     Filter by status (e.g., running, succeeded, failed)
--page       Page number (default: 1)
--page-size  Jobs per page (default: 20)
--json       Output JSON for scripting
--api-url    Jobs API URL (default: http://127.0.0.1:8005)
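
The --json form is handy for filtering with jq; the exact shape of the list payload is not shown here, so the .jobs and .job_id keys below are assumptions to adjust against the real response:
# Print the IDs of all running jobs
# (".jobs[]" and ".job_id" are assumed keys; inspect the real --json output first)
praisonai run list --status running --json | jq -r '.jobs[].job_id'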

Cancel a Job

Cancel a running job:
# Cancel job
praisonai run cancel run_abc123

# JSON output
praisonai run cancel run_abc123 --json
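
Combined with the list command this enables bulk cancellation; as in the listing example above, the jq path into the list payload is an assumption:
# Cancel every running job (".jobs[].job_id" is an assumed path into the list payload)
praisonai run list --status running --json \
    | jq -r '.jobs[].job_id' \
    | while read -r JOB_ID; do praisonai run cancel "$JOB_ID"; done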

Examples

Complete Workflow

# 1. Start the server (in another terminal)
python -m uvicorn praisonai.jobs.server:create_app --port 8005 --factory

# 2. Submit a job with recipe
praisonai run submit "Analyze AI news" --recipe news-analyzer --json
# Output: {"job_id": "run_abc123", "status": "queued", ...}

# 3. Check status
praisonai run status run_abc123

# 4. Stream progress
praisonai run stream run_abc123

# 5. Get result when done
praisonai run result run_abc123

# 6. List all jobs
praisonai run list

Submit with Wait

# Submit and wait for completion
praisonai run submit "Quick analysis" --recipe quick-analyzer --wait --json

Scripting with JSON

#!/bin/bash

API_URL="http://127.0.0.1:8005"

# Submit job
RESULT=$(praisonai run submit "Analyze data" --recipe analyzer --json --api-url "$API_URL")
JOB_ID=$(echo "$RESULT" | jq -r '.job_id')

echo "Submitted job: $JOB_ID"

# Poll for completion
while true; do
    STATUS=$(praisonai run status "$JOB_ID" --json --api-url "$API_URL" | jq -r '.status')
    echo "Status: $STATUS"

    if [ "$STATUS" = "succeeded" ] || [ "$STATUS" = "failed" ]; then
        break
    fi

    sleep 5
done

# Get result
praisonai run result "$JOB_ID" --json --api-url "$API_URL"
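
For short jobs the polling loop can be replaced by a single blocking call:
# Submit and block until the job finishes (equivalent to the loop above for quick jobs)
praisonai run submit "Analyze data" --recipe analyzer --wait --json --api-url "$API_URL"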

Using Webhooks

# Submit with webhook
praisonai run submit "Long running task" \
    --recipe long-task \
    --webhook-url https://example.com/webhook \
    --json
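
To see what the completion callback looks like during development, any throwaway HTTP listener will do; the netcat sketch below is one option (flag syntax varies between netcat implementations, and the port is arbitrary). If the jobs server runs on the same machine, point --webhook-url at http://127.0.0.1:8000/ for the test:
# Dump one incoming webhook request to the terminal
# (OpenBSD netcat syntax; some variants need "nc -l -p 8000")
nc -l 8000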

Idempotent Submission

# First submission creates job
praisonai run submit "Process order" --idempotency-key order-123 --json

# Second submission returns same job (no duplicate)
praisonai run submit "Process order" --idempotency-key order-123 --json

Troubleshooting

Connection Refused

If you get “Connection refused”:
  • Ensure the jobs server is running
  • Check the API URL is correct
  • Verify the port is not blocked

Job Stuck in Queued

If jobs remain queued:
  • Check server logs for errors
  • Verify max concurrent limit
  • Ensure recipe/agent file exists

Timeout Errors

If jobs are timing out:
  • Increase timeout with --timeout
  • Check if external APIs are slow
  • Consider breaking into smaller jobs

See Also