
CLI Reference

The FlowScope CLI analyzes SQL files from the command line, producing lineage in multiple output formats. It also includes a serve mode that runs an embedded web UI with file watching.

```sh
# Using Cargo
cargo install flowscope-cli

# Or download from releases
# https://github.com/pondpilot/flowscope/releases
```
```sh
# Analyze a single file
flowscope query.sql

# Analyze multiple files
flowscope models/*.sql

# Read from stdin
cat query.sql | flowscope
echo "SELECT * FROM orders" | flowscope
```
`-d, --dialect <DIALECT>`

SQL dialect for parsing. Options: `generic`, `ansi`, `postgresql`, `snowflake`, `bigquery`, `duckdb`, `redshift`, `mysql`, `mssql`, `sqlite`, `hive`, `databricks`, `clickhouse`

```sh
flowscope -d snowflake warehouse/*.sql
flowscope --dialect bigquery analytics.sql
```
`-f, --format <FORMAT>`

Output format. Options: `table`, `json`, `mermaid`, `html`, `sql`, `csv`, `xlsx`, `duckdb`

```sh
# Default: table format
flowscope query.sql

# JSON output
flowscope -f json query.sql > lineage.json

# Mermaid diagram
flowscope -f mermaid query.sql > lineage.mmd

# Excel workbook
flowscope -f xlsx -o report.xlsx queries/*.sql

# Interactive HTML report
flowscope -f html -o report.html query.sql
```
`-v, --view <VIEW>`

Lineage view mode. Options: `table`, `column`, `script`, `hybrid`

```sh
# Table-level lineage only
flowscope -v table query.sql

# Column-level lineage
flowscope -v column query.sql

# Script/file-level view
flowscope -v script models/*.sql
```
`-o, --output <FILE>`

Write output to a file instead of stdout.

```sh
flowscope -f json -o lineage.json query.sql
flowscope -f xlsx -o report.xlsx queries/*.sql
```
`-s, --schema <FILE>`

Provide a DDL file for column resolution.

```sh
flowscope -s schema.sql -v column query.sql
```
`--metadata-url <URL>`

Fetch schema from a live database. Supports PostgreSQL, MySQL, and SQLite.

```sh
# PostgreSQL
flowscope --metadata-url postgres://user:pass@localhost/db query.sql

# MySQL
flowscope --metadata-url mysql://user:pass@localhost/db query.sql

# SQLite
flowscope --metadata-url sqlite:///path/to/database.db query.sql
```
`--template <MODE>`

Enable template preprocessing. Options: `raw`, `jinja`, `dbt`

```sh
flowscope --template dbt models/*.sql
```
`--template-var <KEY=VALUE>`

Set template variables for Jinja/dbt preprocessing.

```sh
flowscope --template dbt \
  --template-var target_schema=production \
  --template-var run_date=2024-01-01 \
  models/*.sql
```

Run an embedded web UI with file watching:

```sh
flowscope --serve [OPTIONS] [PATH]
```
| Option | Description |
| --- | --- |
| `--port <PORT>` | HTTP port (default: 8080) |
| `--watch <DIR>` | Directory to watch for changes |
| `--open` | Open browser automatically |
```sh
# Start server watching current directory
flowscope --serve --watch . --open

# Custom port
flowscope --serve --port 3000 --watch ./sql

# Watch multiple directories
flowscope --serve --watch models --watch macros
```

When running in serve mode, the following HTTP endpoints are available:

| Endpoint | Method | Description |
| --- | --- | --- |
| `/api/health` | GET | Health check with version |
| `/api/analyze` | POST | Run lineage analysis |
| `/api/completion` | POST | Code completion |
| `/api/split` | POST | Split SQL into statements |
| `/api/files` | GET | List watched files |
| `/api/schema` | GET | Get schema metadata |
| `/api/export/:format` | POST | Export to format |
| `/api/config` | GET | Server configuration |
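With the server started via `flowscope --serve`, these endpoints can be exercised with `curl`. A minimal sketch; the `{"sql": ...}` request body shape for `/api/analyze` is an assumption, not something documented above:

```sh
# Sketch of a serve-mode API call. The payload shape is an assumption.
BASE="http://localhost:8080"
BODY='{"sql": "SELECT name FROM customers"}'

echo "POST $BASE/api/analyze"
echo "$BODY"

# With the server running:
# curl -s "$BASE/api/health"
# curl -s -X POST "$BASE/api/analyze" \
#      -H 'Content-Type: application/json' -d "$BODY"
```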

Human-readable table output:

```
┌───────────────────────────────────────┐
│ Statement 1: SELECT                   │
├───────────────────────────────────────┤
│ Sources: orders, customers            │
│ Target: (query result)                │
│                                       │
│ Column Lineage:                       │
│   customer_name ← customers.name      │
│   total_amount  ← SUM(orders.amount)  │
└───────────────────────────────────────┘
```

Structured JSON for programmatic use:

```json
{
  "statements": [
    {
      "sql": "SELECT ...",
      "nodes": [...],
      "edges": [...],
      "issues": []
    }
  ],
  "summary": {
    "tables": 2,
    "columns": 4,
    "edges": 6
  }
}
```

Diagram syntax for documentation:

```mermaid
graph LR
  orders[orders]
  customers[customers]
  output[Query Result]
  orders --> output
  customers --> output
```
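To embed the diagram in markdown documentation, the generated Mermaid output can be wrapped in a fenced `mermaid` block, which GitHub renders inline. A sketch with illustrative file names; `lineage.mmd` would normally come from `flowscope -f mermaid`:

```sh
# Stand-in for `flowscope -f mermaid query.sql > lineage.mmd`:
cat > lineage.mmd <<'EOF'
graph LR
orders --> output
EOF

# Wrap the diagram in a fenced block for a markdown page.
{
  echo '```mermaid'
  cat lineage.mmd
  echo '```'
} > lineage.md
```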

Self-contained HTML with interactive visualization. Includes the full React graph component.

ZIP archive containing:

- `scripts.csv` - SQL statements
- `tables.csv` - Tables and columns
- `edges.csv` - Lineage relationships
- `issues.csv` - Validation issues

Workbook with sheets for summary, lineage details, and issues.

DuckDB-compatible DDL and INSERT statements for loading lineage into a database.

Native DuckDB database file (requires native build, not WASM).

```sh
# Analyze and display
flowscope query.sql

# With column-level detail
flowscope -v column query.sql
```
```sh
# Mermaid for markdown docs
flowscope -f mermaid -v table etl.sql > docs/lineage.mmd

# HTML report for sharing
flowscope -f html -o lineage-report.html queries/*.sql
```
```sh
# Analyze dbt models
flowscope --template dbt \
  -d snowflake \
  -v column \
  models/**/*.sql

# With variables
flowscope --template dbt \
  --template-var target_schema=analytics \
  --template-var env=prod \
  -f html -o lineage.html \
  models/*.sql
```
```sh
# JSON output for parsing
flowscope -f json query.sql | jq '.summary'

# Check for issues
flowscope -f json query.sql | jq '.statements[].issues | length'
```
```sh
# Start development server
flowscope --serve --watch ./sql --port 8080 --open

# Server logs:
#   Listening on http://localhost:8080
#   Watching: ./sql
#   Press Ctrl+C to stop
```
| Code | Meaning |
| --- | --- |
| 0 | Success |
| 1 | Parse error or invalid input |
| 2 | File not found or I/O error |
| 3 | Invalid options |
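In scripts and CI jobs, these codes can be branched on directly. A small sketch; the `describe_exit` helper is hypothetical, only the codes themselves come from the table above:

```sh
# Map a FlowScope exit code to a message (hypothetical helper;
# codes taken from the exit-code table).
describe_exit() {
  case "$1" in
    0) echo "success" ;;
    1) echo "parse error or invalid input" ;;
    2) echo "file not found or I/O error" ;;
    3) echo "invalid options" ;;
    *) echo "unknown exit code: $1" ;;
  esac
}

# Typical CI usage (flowscope call shown commented out):
# flowscope -f json models/*.sql > lineage.json
# status=$?
status=0
describe_exit "$status"   # prints "success"
```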