CLI
The BoringData CLI is a tool for generating boilerplate code and adding integrations and pipelines to your stack.
This document provides a comprehensive overview of all available commands.
Installation
uv tool install git+ssh://[email protected]/boringdata/boringdata-cli.git --python 3.12
uv tool install https://github.com/boringdata/boringdata-cli.git --python 3.12
AWS Commands
Commands for managing AWS resources with BoringData.
aws bucket
Create a new S3 bucket configuration with versioning and encryption.
boringdata aws bucket <bucket-name> [--output-folder <path>]
Arguments:
bucket-name: Name of the S3 bucket to create (required)
--output-folder: Directory where files will be created (default: current directory)
Example:
boringdata aws bucket my-data-bucket
boringdata aws bucket my-data-bucket --output-folder pipelines/
Files Created:
.
└── <bucket_name>_bucket.tf # Main bucket configuration
aws lambda
Create a new AWS Lambda function with optional triggers.
boringdata aws lambda <lambda-name> [--output-folder <path>] [--trigger <triggers>]
Arguments:
lambda-name: Name of the Lambda function to create (required)
--output-folder: Directory where files will be created (default: current directory)
--trigger: Comma-separated list of triggers to enable. Options: 'sqs', 'cron'
Example:
boringdata aws lambda my-function
boringdata aws lambda my-function --trigger sqs,cron
boringdata aws lambda my-function --output-folder ./infrastructure
Files Created:
.
├── <lambda_name>_lambda.tf # Lambda function configuration
└── <lambda_name>-lambda/ # Lambda function code
    ├── .env.example # Environment variables template
    ├── .gitignore # Git ignore file
    ├── Dockerfile # Lambda container definition
    ├── requirements.txt # Python dependencies
    └── lambda_handler.py # Lambda function code
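For reference, the snippet below is a minimal sketch of the standard AWS Lambda entry point that lambda_handler.py implements; the generated template will differ, and the payload handling shown (an SQS record loop) is purely illustrative.

import json
import logging

logger = logging.getLogger()
logger.setLevel(logging.INFO)


def handler(event, context):
    """Entry point invoked by AWS Lambda.

    `event` carries the trigger payload (an SQS batch or a scheduled cron event,
    depending on the --trigger option); `context` holds runtime metadata.
    """
    logger.info("Received event: %s", json.dumps(event))

    # Illustrative only: iterate over SQS records when the function has an SQS trigger.
    for record in event.get("Records", []):
        body = json.loads(record["body"])
        logger.info("Processing message: %s", body)

    return {"statusCode": 200, "body": json.dumps({"status": "ok"})}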
aws step-function
Create a new AWS Step Function configuration for orchestration.
boringdata aws step-function <type> [--source-name <name>] [--dbt-command <command>] [--output-folder <path>]
Arguments:
type: Type of step function to create (required). Options: lambda-dbt
--source-name: Name of the source Lambda/ECS task (required for lambda-dbt and ecs-dbt types)
--dbt-command: DBT command to execute (default: "run")
--output-folder: Directory where files will be created (default: current directory)
Example:
boringdata aws step-function lambda-dbt --source-name my-source --dbt-command "run --select tag:daily"
Files Created:
.
├── <source_name>_step_function.tf # Step function configuration
└── orchestrate/ # Step function definitions
    └── <source_name>_step_function.json # Step function state machine
DBT Commands
Commands for managing dbt projects with BoringData.
dbt init
Initialize a new dbt project with configuration.
boringdata dbt init [--output-folder <path>] [--target <type>]
Arguments:
--output-folder: Directory where files will be created (default: current directory)
--target: Target of the project: snowflake or athena (default: snowflake)
Example:
boringdata dbt init
boringdata dbt init --output-folder ./transform --target athena
Files Created:
.
├── ecs_task_dbt.tf # ECS task definition for dbt
└── transform/ # dbt project directory
    ├── ...
    ├── Dockerfile # Container definition for dbt
    ├── Makefile # Common dbt commands
    ├── dbt_project.yml # dbt project configuration
    ├── macros/ # Custom macros
    ├── sources/ # Source models (bronze)
    └── models/ # dbt models
        ├── marts/ # Business-layer models (gold)
        └── staging/ # Staging-layer models (silver)
dbt import-source
Import sources and generate corresponding dbt models.
boringdata dbt import-source --source <path> [--output-folder <path>] [--schema-name <name>] [--target <type>]
Arguments:
--source: Path to the source YAML file or to a <source_name>-schema folder (required)
--output-folder: Directory where files will be created (default: current directory)
--schema-name: Name of the DB schema where sources are stored (default: LANDING)
--target: Target of the dbt project (snowflake, athena)
Example:
boringdata dbt import-source --source ./sources/my_source.yml
boringdata dbt import-source --source ./sources/my_source-schema --target athena
Files Created:
.
├── sources/ # Source definitions
│   └── <source_name>.yml # Source configuration
└── models/ # Generated models
    └── staging/ # Staging models
        └── <source_name>/ # Source-specific models
            ├── stg_<source_name>_<model_name>.sql # Generated staging models
            └── ... # Additional models as needed
DLT Commands
Commands for managing DLT pipelines with BoringData.
dlt add-source
Add a new DLT source using a Lambda function.
boringdata dlt add-source <connector-name> [--source-name <name>] [--destination <type>] [--output-folder <path>]
Arguments:
connector-name: Name of the DLT connector to use (required)
--source-name: Name of your source (defaults to connector-name)
--destination: Destination to use for the Lambda: s3 or iceberg (default: s3)
--output-folder: Directory where files will be created (default: current directory)
Example:
boringdata dlt add-source chess
boringdata dlt add-source chess --source-name my_chess --destination iceberg
boringdata dlt add-source chess --output-folder ./pipelines
Files Created:
.
├── <source_name>_lambda.tf # Lambda function configuration
└── ingest/ # Lambda function code
    └── <source_name>-ingestion/ # Source-specific Lambda
        ├── .dlt/ # DLT configuration
        ├── .env.example # Environment variables template
        ├── .env.local # Local environment variables
        ├── Dockerfile # Lambda container definition
        ├── Makefile # Common commands
        ├── lambda_handler.py # Lambda function code
        ├── requirements.txt # Python dependencies
        └── requirements-dev.txt # Development dependencies
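As a rough illustration of what lambda_handler.py in <source_name>-ingestion/ does, the sketch below runs a DLT pipeline against the filesystem (S3) destination. The resource, pipeline name, and dataset name are placeholder assumptions, not the template's actual code; real connectors and credentials are configured under .dlt/ and the environment files.

import dlt


@dlt.resource(name="players", write_disposition="replace")
def players():
    # Placeholder resource: a real connector (e.g. the chess source) yields API records here.
    yield {"id": 1, "username": "magnuscarlsen"}
    yield {"id": 2, "username": "hikaru"}


def handler(event, context):
    """Lambda entry point: runs the DLT pipeline on each invocation."""
    pipeline = dlt.pipeline(
        pipeline_name="chess_ingestion",   # placeholder name
        destination="filesystem",          # filesystem destination pointed at S3 via bucket_url
        dataset_name="chess_raw",
    )
    # bucket_url and credentials are normally read from .dlt/secrets.toml or environment variables.
    load_info = pipeline.run(players())
    return {"statusCode": 200, "body": str(load_info)}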
dlt get-schema
Get the destination table schema of the DLT pipeline.
boringdata dlt get-schema <source-name> [--engine <type>] [--target <format>] [--output-folder <path>]
Arguments:
source-name: Name of your source (required)
--engine: Target typing format: arrow or snowflake (default: arrow)
--target: Target output format: yaml or pyiceberg (default: yaml)
--output-folder: Directory where files will be created (default: current directory)
Example:
boringdata dlt get-schema my-source
boringdata dlt get-schema my-source --engine snowflake --target yaml
boringdata dlt get-schema my-source --engine arrow --target pyiceberg
boringdata dlt get-schema my-source --output-folder ./schemas
Files Created (for yaml target):
.
└── <source_name>_source_schema.yml # Data contract for the DLT pipeline
Files Created (for pyiceberg target):
.
└── <source_name>-schema/ # Schema migration directory
    ├── Makefile # Makefile to run schema migration
    └── <source_name>_<table_name>.py # Migration script for each table
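A migration script like <source_name>_<table_name>.py typically creates the Iceberg table so it matches the schema the DLT pipeline produces. The generated code will differ; this is only a sketch of the pyiceberg pattern, with the catalog settings, namespace, and columns as placeholder assumptions.

from pyiceberg.catalog import load_catalog
from pyiceberg.schema import Schema
from pyiceberg.types import LongType, NestedField, StringType, TimestampType

# Placeholder schema: the generated script derives these fields from the DLT pipeline schema.
SCHEMA = Schema(
    NestedField(1, "id", LongType(), required=True),
    NestedField(2, "username", StringType(), required=False),
    NestedField(3, "loaded_at", TimestampType(), required=False),
)


def main() -> None:
    # Catalog settings are assumptions; in practice they come from the project's
    # pyiceberg configuration or environment variables (e.g. an AWS Glue catalog).
    catalog = load_catalog("default", **{"type": "glue"})
    catalog.create_table_if_not_exists("landing.players", schema=SCHEMA)


if __name__ == "__main__":
    main()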
Snowflake Commands
Commands for managing Snowflake resources with BoringData.
snowflake tf-module
Generate a Terraform Snowflake module for database and warehouse management.
boringdata snowflake tf-module [--output-folder <path>]
Arguments:
--output-folder: Directory where files will be created (default: current directory)
Example:
boringdata snowflake tf-module
boringdata snowflake tf-module --output-folder ./modules/snowflake
Files Created:
.
├── data.tf # Data sources
├── db.tf # Database configuration
├── locals.tf # Local variables
├── schema.tf # Schema configuration
├── tech_user.tf # Technical user configuration
├── versions.tf # Provider and terraform versions
└── warehouse.tf # Warehouse configuration
Project Commands
Commands for managing project-level resources with BoringData.
project init
Initialize a new BoringData project with infrastructure as code.
boringdata project init <project-type> [--local-state] [--output-folder <path>]
Arguments:
project-type: Type of project to create (required). Options: 'aws', 'aws-snowflake'
--local-state: Use local state instead of remote state in S3 (default: remote S3 state)
--output-folder: Directory where files will be created (default: current directory)
Example:
boringdata project init aws
boringdata project init aws-snowflake --local-state
boringdata project init aws --output-folder ./my-project
Files Created:
.
├── README.md # Project documentation
├── Makefile # Common commands
├── .gitignore # Git ignore file
├── base/ # Base infrastructure
│   ├── aws/ # AWS-specific base
│   │   └── main.tf # Main configuration
│   └── snowflake/ # Snowflake-specific base
│       └── main.tf # Main configuration
├── pipelines/ # Data pipeline infrastructure
│   └── infra/ # Infrastructure configuration
│       └── aws/ # AWS pipeline resources
│           └── main.tf # Main configuration
└── live/ # Terragrunt configurations
    ├── root.hcl # Root Terragrunt configuration
    ├── base/ # Base infrastructure live configs
    │   ├── aws/ # AWS base live config
    │   │   └── terragrunt.hcl # Terragrunt configuration
    │   └── snowflake/ # Snowflake base live config
    │       └── terragrunt.hcl # Terragrunt configuration
    └── pipelines/ # Pipeline live configs
        └── infra/ # Infrastructure live configs
            └── aws/ # AWS pipeline live config
                └── terragrunt.hcl # Terragrunt configuration
Terragrunt Commands
Commands for managing Terragrunt configurations with BoringData.
terragrunt init
Initialize a new Terragrunt directory by generating the root.hcl file.
boringdata terragrunt init [--local-state] [--output-folder <path>]
Arguments:
--local-state: Use local state instead of remote state in S3 (default: False)
--output-folder: Directory where files will be created (default: current directory)
Example:
boringdata terragrunt init
boringdata terragrunt init --local-state
boringdata terragrunt init --output-folder live/
Files Created:
.
└── root.hcl # Root Terragrunt configuration
terragrunt add-module
Add a new Terragrunt module configuration.
boringdata terragrunt add-module --module-path <path> [--output-folder <path>]
Arguments:
--module-path: Relative path to the module from --output-folder (required)
--output-folder: Directory where the terragrunt.hcl file will be created (default: current directory)
Example:
boringdata terragrunt add-module --module-path ../../../base/aws --output-folder live/base/aws
Files Created:
.
└── terragrunt.hcl # Module Terragrunt configuration
GitHub Commands
Commands for managing GitHub workflows with BoringData.
github init
Initialize GitHub workflows for CI/CD.
boringdata github init [--template-type <type>] [--output-folder <path>]
Arguments:
--template-type: Type of template to use: aws or aws-snowflake (default: aws)
--output-folder: Directory where files will be created (default: current directory)
Example:
boringdata github init
boringdata github init --template-type aws-snowflake
boringdata github init --output-folder ./my-project
Files Created:
.
└── .github/
    └── workflows/
        └── ci.yml # CI workflow