The BoringData CLI is a tool for generating boilerplate code and adding integrations and pipelines to your stack.
This document provides a comprehensive overview of all available commands.
Installation
SSH GitHub auth:
uv tool install git+ssh://[email protected]/boringdata/boringdata-cli.git --python 3.12
HTTPS GitHub auth:
uv tool install git+https://github.com/boringdata/boringdata-cli.git --python 3.12
Table of Contents
AWS Commands
DBT Commands
DLT Commands
Snowflake Commands
Project Commands
Terragrunt Commands
GitHub Commands
AWS Commands
Commands for managing AWS resources with BoringData.
aws bucket
Create a new S3 bucket configuration with versioning and encryption.
boringdata aws bucket <bucket-name> [--output-folder <path>]
Arguments:
bucket-name
: Name of the S3 bucket to create (required)
--output-folder
: Directory where files will be created (default: current directory)
Example:
boringdata aws bucket my-data-bucket
boringdata aws bucket my-data-bucket --output-folder pipelines/
Files Created:
.
└── <bucket_name>_bucket.tf # Main bucket configuration
aws lambda
Create a new AWS Lambda function with optional triggers.
boringdata aws lambda <lambda-name> [--output-folder <path>] [--trigger <triggers>]
Arguments:
lambda-name
: Name of the Lambda function to create (required)
--output-folder
: Directory where files will be created (default: current directory)
--trigger
: Comma-separated list of triggers to enable. Options: 'sqs', 'cron'
Example:
boringdata aws lambda my-function
boringdata aws lambda my-function --trigger sqs,cron
boringdata aws lambda my-function --output-folder ./infrastructure
Files Created:
.
├── <lambda_name>_lambda.tf        # Lambda function configuration
└── <lambda_name>-lambda/          # Lambda function code
    ├── .env.example               # Environment variables template
    ├── .gitignore                 # Git ignore file
    ├── Dockerfile                 # Lambda container definition
    ├── requirements.txt           # Python dependencies
    └── lambda_handler.py          # Lambda function code
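The generated lambda_handler.py exposes the standard AWS Lambda entry point. A minimal sketch of what such a handler looks like (illustrative only; the generated code will differ):

```python
import json


def handler(event, context):
    """Standard Lambda entry point: receives the trigger event
    (e.g. SQS messages or a cron invocation) and returns a response."""
    # SQS triggers deliver messages under the "Records" key;
    # cron (EventBridge) invocations carry no Records.
    records = event.get("Records", [])
    for record in records:
        payload = json.loads(record.get("body", "{}"))
        # ... process the message payload here ...
    return {"statusCode": 200, "processed": len(records)}


if __name__ == "__main__":
    # Local smoke test with a fake SQS event
    print(handler({"Records": [{"body": json.dumps({"id": 1})}]}, None))
```

The Dockerfile packages this module into a Lambda container image, and .env.example lists the environment variables the handler expects.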
aws step-function
Create a new AWS Step Function configuration for orchestration.
boringdata aws step-function <type> [--source-name <name>] [--dbt-command <command>] [--output-folder <path>]
Arguments:
type
: Type of step function to create (required). Options: lambda-dbt
--source-name
: Name of the source Lambda/ECS task (required for lambda-dbt and ecs-dbt types)
--dbt-command
: dbt command to execute (default: "run")
--output-folder
: Directory where files will be created (default: current directory)
Example:
boringdata aws step-function lambda-dbt --source-name my-source --dbt-command "run --select tag:daily"
Files Created:
.
├── <source_name>_step_function.tf         # Step function configuration
└── orchestrate/                           # Step function definitions
    └── <source_name>_step_function.json   # Step function state machine
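The state machine JSON follows Amazon States Language: the source Lambda runs first, then the dbt task. A hypothetical minimal definition for a lambda-dbt flow, sketched as a Python dict for readability (all names and ARNs are placeholders, not what the CLI actually emits):

```python
import json

# Hypothetical lambda-dbt state machine: invoke the source Lambda,
# then run dbt, mirroring the orchestration the CLI scaffolds.
state_machine = {
    "Comment": "Ingest with a source Lambda, then transform with dbt",
    "StartAt": "RunSourceLambda",
    "States": {
        "RunSourceLambda": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:REGION:ACCOUNT:function:my-source",  # placeholder ARN
            "Next": "RunDbt",
        },
        "RunDbt": {
            "Type": "Task",
            "Resource": "arn:aws:states:::ecs:runTask.sync",  # dbt running as an ECS task
            "Parameters": {
                "Overrides": {
                    "ContainerOverrides": [
                        {"Name": "dbt", "Command": ["dbt", "run", "--select", "tag:daily"]}
                    ]
                }
            },
            "End": True,
        },
    },
}

print(json.dumps(state_machine, indent=2))
```

The `--dbt-command` option ends up as the container command of the dbt task, as in the example above.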
DBT Commands
Commands for managing dbt projects with BoringData.
dbt init
Initialize a new dbt project with configuration.
boringdata dbt init [--output-folder <path>] [--target <type>]
Arguments:
--output-folder
: Directory where files will be created (default: current directory)
--target
: Target of the project: snowflake or athena (default: snowflake)
Example:
boringdata dbt init
boringdata dbt init --output-folder ./transform --target athena
Files Created:
.
├── ecs_task_dbt.tf            # ECS task definition for dbt
└── transform/                 # dbt project directory
    ├── ...
    ├── Dockerfile             # Container definition for dbt
    ├── Makefile               # Common dbt commands
    ├── dbt_project.yml        # dbt project configuration
    ├── macros/                # Custom macros
    ├── sources/               # Source models (bronze)
    └── models/                # dbt models
        ├── marts/             # Business-layer models (gold)
        └── staging/           # Staging-layer models (silver)
dbt import-source
Import sources and generate corresponding dbt models.
boringdata dbt import-source --source <path> [--output-folder <path>] [--schema-name <name>] [--target <type>]
Arguments:
--source
: Path to the source YAML file or to a <source_name>-schema folder (required)
--output-folder
: Directory where files will be created (default: current directory)
--schema-name
: Name of the DB schema where sources are stored (default: LANDING)
--target
: Target of the dbt project (snowflake, athena)
Example:
boringdata dbt import-source --source ./sources/my_source.yml
boringdata dbt import-source --source ./sources/my_source-schema --target athena
Files Created:
.
├── sources/                                        # Source definitions
│   └── <source_name>.yml                           # Source configuration
└── models/                                         # Generated models
    └── staging/                                    # Staging models
        └── <source_name>/                          # Source-specific models
            ├── stg_<source_name>_<model_name>.sql  # Generated staging models
            └── ...                                 # Additional models as needed
DLT Commands
Commands for managing DLT pipelines with BoringData.
dlt add-source
Add a new DLT source that runs as a Lambda function.
boringdata dlt add-source <connector-name> [--source-name <name>] [--destination <type>] [--output-folder <path>]
Arguments:
connector-name
: Name of the DLT connector to use (required)
--source-name
: Name of your source (default: the connector name)
--destination
: Destination to use for the lambda (s3 or iceberg) (default: s3)
--output-folder
: Directory where files will be created (default: current directory)
Example:
boringdata dlt add-source chess
boringdata dlt add-source chess --source-name my_chess --destination iceberg
boringdata dlt add-source chess --output-folder ./pipelines
Files Created:
.
├── <source_name>_lambda.tf        # Lambda function configuration
└── ingest/                        # Lambda function code
    └── <source_name>-ingestion/   # Source-specific Lambda
        ├── .dlt/                  # DLT configuration
        ├── .env.example           # Environment variables template
        ├── .env.local             # Local environment variables
        ├── Dockerfile             # Lambda container definition
        ├── Makefile               # Common commands
        ├── lambda_handler.py      # Lambda function code
        ├── requirements.txt       # Python dependencies
        └── requirements-dev.txt   # Development dependencies
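Inside <source_name>-ingestion/, lambda_handler.py wires the DLT connector into a Lambda entry point. A rough sketch of the pattern, assuming the chess connector and a filesystem/S3 destination (illustrative only; the connector import, its arguments, and the generated handler will differ):

```python
def handler(event, context):
    """Lambda entry point: run the DLT pipeline on each invocation."""
    # Imported lazily so the module can be inspected without dlt installed.
    import dlt
    from chess import source  # hypothetical: the DLT chess connector module

    pipeline = dlt.pipeline(
        pipeline_name="chess_ingestion",
        destination="filesystem",  # e.g. an S3 bucket configured via .dlt/ secrets
        dataset_name="chess_data",
    )
    # Run the connector; credentials and options come from .dlt/ and env vars.
    load_info = pipeline.run(source(players=["magnuscarlsen"]))  # hypothetical args
    return {"statusCode": 200, "load_info": str(load_info)}
```

The .dlt/ folder holds the pipeline's config and secrets, and .env.local supplies local overrides when running via the Makefile.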
dlt get-schema
Get the destination table schema of the DLT pipeline.
boringdata dlt get-schema <source-name> [--engine <type>] [--target <format>] [--output-folder <path>]
Arguments:
source-name
: Name of your source (required)
--engine
: Target typing format: arrow or snowflake (default: arrow)
--target
: Target output format: yaml or pyiceberg (default: yaml)
--output-folder
: Directory where files will be created (default: current directory)
Example:
boringdata dlt get-schema my-source
boringdata dlt get-schema my-source --engine snowflake --target yaml
boringdata dlt get-schema my-source --engine arrow --target pyiceberg
boringdata dlt get-schema my-source --output-folder ./schemas
Files Created (for yaml target):
.
└── <source_name>_source_schema.yml # Data contract for the DLT pipeline
Files Created (for pyiceberg target):
.
└── <source_name>-schema/              # Schema migration directory
    ├── Makefile                       # Makefile to run schema migration
    └── <source_name>_<table_name>.py  # Migration script for each table
Snowflake Commands
Commands for managing Snowflake resources with BoringData.
snowflake tf-module
Generate a Terraform Snowflake module for database and warehouse management.
boringdata snowflake tf-module [--output-folder <path>]
Arguments:
--output-folder
: Directory where files will be created (default: current directory)
Example:
boringdata snowflake tf-module
boringdata snowflake tf-module --output-folder ./modules/snowflake
Files Created:
.
├── data.tf # Data sources
├── db.tf # Database configuration
├── locals.tf # Local variables
├── schema.tf # Schema configuration
├── tech_user.tf # Technical user configuration
├── versions.tf # Provider and terraform versions
└── warehouse.tf # Warehouse configuration
Project Commands
Commands for managing project-level resources with BoringData.
project init
Initialize a new BoringData project with infrastructure as code.
boringdata project init <project-type> [--local-state] [--output-folder <path>]
Arguments:
project-type
: Type of project to create (required). Options: 'aws', 'aws-snowflake'
--local-state
: Use local state instead of remote state in S3 (default: remote S3 state)
--output-folder
: Directory where files will be created (default: current directory)
Example:
boringdata project init aws
boringdata project init aws-snowflake --local-state
boringdata project init aws --output-folder ./my-project
Files Created:
.
├── README.md # Project documentation
├── Makefile # Common commands
├── .gitignore # Git ignore file
├── base/                          # Base infrastructure
│   ├── aws/                       # AWS-specific base
│   │   └── main.tf                # Main configuration
│   └── snowflake/                 # Snowflake-specific base
│       └── main.tf                # Main configuration
├── pipelines/                     # Data pipeline infrastructure
│   └── infra/                     # Infrastructure configuration
│       └── aws/                   # AWS pipeline resources
│           └── main.tf            # Main configuration
└── live/                          # Terragrunt configurations
    ├── root.hcl                   # Root Terragrunt configuration
    ├── base/                      # Base infrastructure live configs
    │   ├── aws/                   # AWS base live config
    │   │   └── terragrunt.hcl     # Terragrunt configuration
    │   └── snowflake/             # Snowflake base live config
    │       └── terragrunt.hcl     # Terragrunt configuration
    └── pipelines/                 # Pipeline live configs
        └── infra/                 # Infrastructure live configs
            └── aws/               # AWS pipeline live config
                └── terragrunt.hcl # Terragrunt configuration
Terragrunt Commands
Commands for managing Terragrunt configurations with BoringData.
terragrunt init
Initialize a new Terragrunt directory by generating the root.hcl file.
boringdata terragrunt init [--local-state] [--output-folder <path>]
Arguments:
--local-state
: Use local state instead of remote state in S3 (default: remote S3 state)
--output-folder
: Directory where files will be created (default: current directory)
Example:
boringdata terragrunt init
boringdata terragrunt init --local-state
boringdata terragrunt init --output-folder live/
Files Created:
.
└── root.hcl # Root Terragrunt configuration
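root.hcl centralizes settings, such as remote state, that child modules inherit. A hypothetical sketch of what such a file contains (bucket name and region are placeholders; the generated file will differ):

```hcl
# Hypothetical root.hcl: remote S3 state shared by all child modules.
remote_state {
  backend = "s3"
  config = {
    bucket = "my-terraform-state"                               # placeholder bucket
    key    = "${path_relative_to_include()}/terraform.tfstate"  # one state file per module
    region = "eu-west-1"                                        # placeholder region
  }
}
```

With `--local-state`, the backend would be local instead of S3.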
terragrunt add-module
Add a new Terragrunt module configuration.
boringdata terragrunt add-module --module-path <path> [--output-folder <path>]
Arguments:
--module-path
: Relative path to the module from --output-folder (required)
--output-folder
: Directory where terragrunt.hcl file will be created (default: current directory)
Example:
boringdata terragrunt add-module --module-path ../../../base/aws --output-folder live/base/aws
Files Created:
.
└── terragrunt.hcl # Module Terragrunt configuration
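A module-level terragrunt.hcl typically includes the root configuration and points at the Terraform module via `--module-path`. A hypothetical sketch matching the example above (the generated file will differ):

```hcl
# Hypothetical terragrunt.hcl for live/base/aws
include "root" {
  path = find_in_parent_folders("root.hcl")
}

terraform {
  source = "../../../base/aws"  # the value passed via --module-path
}
```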
GitHub Commands
Commands for managing GitHub workflows with BoringData.
github init
Initialize GitHub workflows for CI/CD.
boringdata github init [--template-type <type>] [--output-folder <path>]
Arguments:
--template-type
: Type of template to use (aws or aws-snowflake) (default: aws)
--output-folder
: Directory where files will be created (default: current directory)
Example:
boringdata github init
boringdata github init --template-type aws-snowflake
boringdata github init --output-folder ./my-project
Files Created:
.
└── .github/
    └── workflows/
        └── ci.yml             # CI workflow