CI Deployment

This guide outlines the essential steps for deploying the AWS + Snowflake Data Stack Template in a production environment.

1. Create Dedicated Users

A best practice when deploying with Terraform is to create dedicated credentials that Terraform will use during the deployment.

Terraform should use only these users, and they should have the minimal rights required.

For both AWS and Snowflake, we provide ready-to-use policies and scripts so you can quickly create these users with your admin account.

You will notice that these users are environment-specific: each Terraform user can only deploy to one environment. What is an "environment"?

For AWS:

cd init/
export AWS_PROFILE=<YOUR AWS ADMIN PROFILE>
make create-tf-user-aws env=<environment> aws_region=<aws_region>

This script will:

  • create a new user called <ENVIRONMENT>_AWS_SF_ADMIN

  • attach this policy to it

  • create files .env.<environment>.secrets and .env.<environment>.variables
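For reference, these env files hold the credentials and variables that Terraform (and later the CI) will use. The exact keys depend on the template version; the names below are only an illustrative assumption.

# Illustrative contents only - the actual keys are defined by the template
# .env.<environment>.secrets
AWS_ACCESS_KEY_ID=<generated access key>
AWS_SECRET_ACCESS_KEY=<generated secret key>

# .env.<environment>.variables
AWS_REGION=<aws_region>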

For Snowflake:

cd init/
export SNOWFLAKE_PROFILE=<YOUR SNOWFLAKE ADMIN PROFILE>
make create-tf-user-snowflake env=<environment> aws_region=<aws_region>

This script will:

  • Add credentials to .env.<environment>.secrets and .env.<environment>.variables. The account name is parsed directly from your SNOWFLAKE_PROFILE.

  • Create a new SQL script init_snowflake_tf_user_<environment>.sql. This script creates the user <ENVIRONMENT>_AWS_SF_ADMIN.

  • The script is executed automatically if SnowSQL is installed and configured (with a SnowSQL connection named after your profile). If not, you can run it directly in your Snowflake console.
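If you prefer to run the generated script yourself with SnowSQL, the invocation looks roughly like this (assuming the script is generated in the init/ folder and your connection name matches your profile):

# Run the generated script against your admin connection
cd init/
snowsql -c <YOUR SNOWFLAKE ADMIN PROFILE> -f init_snowflake_tf_user_<environment>.sql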

2. CI/CD Pipeline Setup

The default version of the template does not include a CI/CD pipeline.

To add it, run:

# Remove boringdata's internal test workflow
rm .github/workflows/boringdata-test.yml

# Initialize GitHub workflows for CI/CD
# This will create a .github/workflows/ci.yml file with AWS deployment configuration
uvx boringdata github init --template-type aws-snowflake

# Initialize Terragrunt configuration to use S3 remote state
# This will create/update the root.hcl file in the live/ directory
uvx boringdata terragrunt init --output-folder live

This will:

  • Add a ready-to-use GitHub Actions workflow in the .github/workflows folder

  • Update the Terragrunt configuration to point to the S3 remote state bucket

The GitHub Actions workflow requires AWS and Snowflake credentials to deploy the project.

You must, therefore, create the necessary variables and secrets in your GitHub repository.

If you have the GitHub CLI installed and are authorized for your repository, run the following commands from the project root:

cd init/
make github-ci-setup repo=<github account>/<repo> env=<your environment>

This command automatically creates the required variables and secrets in GitHub from the .env files created previously for your AWS and Snowflake profiles.

Alternatively, you can set them up manually in the GitHub console or with the GitHub CLI, as sketched below.
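The manual CLI version looks roughly like this. The variable and secret names below are illustrative assumptions, so match them to what your generated ci.yml workflow actually reads, with values taken from the .env files.

# Illustrative names only - match them to what your ci.yml expects
gh variable set AWS_REGION --repo <github account>/<repo> --body "<aws_region>"
gh secret set AWS_ACCESS_KEY_ID --repo <github account>/<repo> --body "<access key>"
gh secret set AWS_SECRET_ACCESS_KEY --repo <github account>/<repo> --body "<secret key>"
gh secret set SNOWFLAKE_PASSWORD --repo <github account>/<repo> --body "<password>"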

That's it; you are now ready to deploy.

3. Set Up Terraform State S3 Bucket

You must use a dedicated S3 bucket to store the Terraform state for production deployment.

To create the bucket, run the following command:

cd init/
export AWS_PROFILE=<your-aws-profile>
make create-tf-bucket env=<environment> aws_region=<aws_region>

This command will:

  • Create a new S3 bucket named <environment>-<aws-region>-terraform-state-bucket.

  • Configure the appropriate bucket policies and enable encryption and versioning.
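Before the first deployment, you can quickly confirm with the AWS CLI that versioning and encryption are enabled on the new bucket:

# Check the state bucket created by the make target above
export AWS_PROFILE=<your-aws-profile>
aws s3api get-bucket-versioning --bucket <environment>-<aws-region>-terraform-state-bucket
aws s3api get-bucket-encryption --bucket <environment>-<aws-region>-terraform-state-bucket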

4. Deployment

The CI pipeline runs on every merge to the main branch and deploys to the environment defined in the variables.

The CI pipeline will start automatically once you push your changes to the repository.

GitHub CI

The CI pipeline consists of two jobs:

  • Terragrunt-apply: Deploys the infrastructure using Terragrunt

  • Deploy-dockers: Builds and deploys Docker containers

The deploy-dockers job executes make deploy in all pipelines/ingest and pipelines/transform folders that contain a Dockerfile.

When the CI pipeline runs after a merge, only the folders with changes are processed.

When the workflow is run from scratch, all Docker images are built and deployed.
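Conceptually, the deploy-dockers job behaves roughly like the shell loop below. This is a simplified sketch of the behavior described above, not the actual workflow step.

# Simplified sketch, not the actual CI step: run make deploy wherever a Dockerfile exists
for dir in pipelines/ingest/*/ pipelines/transform/*/; do
  if [ -f "${dir}Dockerfile" ]; then
    (cd "$dir" && make deploy)
  fi
done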

CI Docker Build Process

Verify the Deployment

After deployment is complete, verify the setup in your AWS console:

  1. Navigate to the AWS Step Functions service

  2. Locate your pipeline's step function (e.g., prod-chess-step-function)

  3. Execute the step function with an empty payload (a CLI alternative is sketched after this list)

  4. Monitor the execution to ensure the pipeline runs successfully
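If you prefer the command line, the same check can be done with the AWS CLI. The state machine name below reuses the example above; adjust the region, account ID, and name to your environment.

# Start an execution with an empty payload (example state machine name)
aws stepfunctions start-execution \
  --state-machine-arn arn:aws:states:<aws_region>:<account_id>:stateMachine:prod-chess-step-function \
  --input '{}'

# Check the status with the execution ARN returned by the previous command
aws stepfunctions describe-execution --execution-arn <execution arn>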
