Command Line Interface
Overview
The Boltbase CLI provides all the necessary commands to manage and execute DAGs (workflows) directly from the terminal. It allows you to start, stop, retry, and monitor workflows, as well as manage the underlying scheduler and web server.
Complete Reference
For the full CLI command reference, see CLI Commands Reference.
Basic Usage
boltbase [global options] command [command options] [arguments...]
Getting Help
# General help
boltbase --help
# Command-specific help
boltbase start --help
# Show version
boltbase version
Essential Commands
Running Workflows
Start a Workflow
# Basic execution
boltbase start my-workflow.yaml
# Interactive DAG selection (when no file is specified)
boltbase start
# With named parameters (use -- separator)
boltbase start etl.yaml -- DATE=2024-01-01 ENV=prod
# With positional parameters
boltbase start my-workflow.yaml -- value1 value2 value3
# Override DAG name
boltbase start --name my_custom_name my-workflow.yaml
# Queue for later
boltbase enqueue my-workflow.yaml
# Remove a queued run (by queue name)
boltbase dequeue default
Stop a Running Workflow
# Stop currently running workflow
boltbase stop my-workflow
# Stop specific run
boltbase stop --run-id=20240101_120000 my-workflow
# Can also use file path
boltbase stop my-workflow.yaml
Restart a Workflow
# Restart latest run
boltbase restart my-workflow
# Restart specific run
boltbase restart --run-id=20240101_120000 my-workflow
Retry Failed Workflow
# Retry specific run (run-id is required)
boltbase retry --run-id=20240101_120000 my-workflow
# Can also use file path
boltbase retry --run-id=20240101_120000 my-workflow.yaml
Monitoring Workflows
Check Status
# Check latest run status
boltbase status my-workflow
# Check specific run status
boltbase status --run-id=20240101_120000 my-workflow
# Can also use file path
boltbase status my-workflow.yaml
View Status of a DAG Run
# Check detailed status and output
boltbase status my-workflow.yaml
# Note: For detailed logs, use the web UI at http://localhost:8080
# or check log files in the configured log directory
View Execution History
The history command displays past DAG executions with filtering and export capabilities:
# View recent runs
boltbase history my-workflow
# Debug recent failures
boltbase history my-workflow --status failed --last 7d
# Export to JSON for analysis
boltbase history --format json --limit 500 > history.json
# Export to CSV for spreadsheets
boltbase history --format csv > history.csv
# Filter by tags (AND logic)
boltbase history --tags "prod,critical"
Key features:
- Default: last 30 days, 100 results
- Date filters: absolute (--from/--to) or relative (--last 7d)
- Status filters: succeeded, failed, running, etc. (with aliases)
- Output: table (default), JSON, or CSV
- Run IDs never truncated
See history reference for all options.
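Exported history can be sliced with standard text tools. A minimal sketch, assuming a CSV layout with name, run ID, and status columns — the column order here is an assumption, so check the headers in your actual export:

```shell
# Sample data standing in for a real `boltbase history --format csv` export;
# the column layout is assumed for illustration.
cat > history.csv <<'EOF'
name,run_id,status
etl,20240101_120000,failed
etl,20240102_120000,succeeded
report,20240101_130000,failed
EOF

# Count failed runs per workflow (NR > 1 skips the header row).
awk -F, 'NR > 1 && $3 == "failed" { c[$1]++ } END { for (w in c) print w, c[w] }' history.csv
```

This prints one line per workflow with its failure count, e.g. `etl 1`.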
Testing and Validation
Validate DAG Specification
# Validate DAG structure and references
boltbase validate my-workflow.yaml
# Returns human-readable validation errors, if any
Dry Run
# Test DAG execution without running it
boltbase dry my-workflow.yaml
# With parameters
boltbase dry my-workflow.yaml -- DATE=2024-01-01
# Override DAG name
boltbase dry --name my_custom_name my-workflow.yaml
Server Commands
Start Everything
# Start scheduler, web UI, and coordinator service (default: localhost:8080)
boltbase start-all
# Custom host and port
boltbase start-all --host=0.0.0.0 --port=9000
# Custom DAGs directory
boltbase start-all --dags=/path/to/directory
Start Scheduler Only
# Run just the scheduler (no UI)
boltbase scheduler
# Custom DAGs directory
boltbase scheduler --dags=/opt/workflows
Start Web UI Only
# Run just the web server (no scheduler)
boltbase server
# Custom host and port
boltbase server --host=0.0.0.0 --port=9000
# Custom DAGs directory
boltbase server --dags=/path/to/directory
Distributed Execution Commands
Start Coordinator
# Start the coordinator gRPC server
boltbase coordinator
# Custom host and port
boltbase coordinator --coordinator.host=0.0.0.0 --coordinator.port=50055
# With TLS
boltbase coordinator \
--peer.cert-file=server.pem \
--peer.key-file=server-key.pem
The coordinator service distributes tasks to workers for distributed execution, with automatic service registration and health monitoring.
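The certificate paths in the TLS example above can be exercised locally with a self-signed pair. A minimal sketch using openssl, for local testing only — production deployments should use properly issued certificates:

```shell
# Create a self-signed key/cert pair matching the file names in the
# --peer.cert-file / --peer.key-file example (local testing only).
openssl req -x509 -newkey rsa:2048 -nodes \
  -keyout server-key.pem -out server.pem \
  -days 365 -subj "/CN=localhost"
```

This produces `server.pem` and `server-key.pem` in the current directory, ready to pass to the coordinator flags shown above.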
Start Worker
# Start a worker that polls for tasks
boltbase worker
# With labels for capability matching
boltbase worker --worker.labels gpu=true,memory=64G,region=us-east-1
# With custom worker ID and concurrency
boltbase worker \
--worker.id=gpu-worker-01 \
--worker.max-active-runs=50
Workers automatically register with the service registry and poll the coordinator for tasks that match their labels.
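On the workflow side, a step could then be routed to workers by their labels. The sketch below is an assumption about the selector syntax — the `workerSelector` field name is not confirmed by this page; check the workflow syntax reference for the actual key:

```yaml
# Hypothetical: route one step to a GPU worker in us-east-1.
# `workerSelector` and its matching semantics are assumed, not documented here.
steps:
  - name: train-model
    command: python train.py
    workerSelector:
      gpu: "true"
      region: us-east-1
```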
Interactive DAG Selection
When you run boltbase start without specifying a DAG file, an interactive selector appears:
boltbase start
Features:
- Browse available DAGs with filtering
- Enter parameters interactively
- Confirm before execution
Advanced Usage
Queue Management
# Add to queue
boltbase enqueue my-workflow.yaml
# Add to queue with custom ID
boltbase enqueue --run-id=custom-001 my-workflow.yaml
# Add to queue with parameters
boltbase enqueue my-workflow.yaml -- KEY=value
# Add to queue using a specific queue (override)
boltbase enqueue --queue=high-priority my-workflow.yaml
# Override DAG name
boltbase enqueue --name my_custom_name my-workflow.yaml
# Remove next item from queue
boltbase dequeue default
# Remove specific run from queue
boltbase dequeue default --dag-run=my-workflow:custom-001
Working with Parameters
Parameters can be passed in multiple ways:
# Positional parameters (use -- separator)
boltbase start my-workflow.yaml -- param1 param2 param3
# Named parameters (use -- separator)
boltbase start my-workflow.yaml -- KEY1=value1 KEY2=value2
# Mixed (use -- separator)
boltbase start my-workflow.yaml -- param1 KEY=value param2
CLI Configuration
Global Options
| Option | Description | Default |
|---|---|---|
| --config | Config file path | ~/.config/boltbase/config.yaml |
| --log-level | Log verbosity | info |
| --log-format | Output format | text |
| --quiet | Suppress output | false |
See Also
- Explore the REST API for programmatic access
- Set up the Web UI for visual monitoring
- Learn workflow syntax to build complex DAGs
- Configure distributed execution for scaling workflows
