1.14.2.2 Multi-Branch Workflows

Teams often use multiple branches in their development workflows. For instance, some teams create a feature branch while working on new functionality; once the functionality is ready, the feature branch is merged into the main branch and then deleted.

While a feature is under development, you'll often want to run tests against the feature branch and possibly deploy it to a staging environment. To model this in Concourse, you'll need a pipeline for each active feature branch. Manually setting (and eventually archiving) a pipeline for each feature branch would be quite a burden. For this type of workflow, Concourse provides a few important tools to help you out: the set_pipeline step, the across step modifier, and instanced pipelines.

The across step modifier and instanced pipelines are both experimental features, and must be enabled with the feature flags CONCOURSE_ENABLE_ACROSS_STEP and CONCOURSE_ENABLE_PIPELINE_INSTANCES, respectively.
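
For example, if you run the web node directly, one way to enable both flags is to export them as environment variables before starting it (a minimal sketch; adjust for your own deployment method, such as a Helm chart or BOSH release):

CONCOURSE_ENABLE_ACROSS_STEP=true \
CONCOURSE_ENABLE_PIPELINE_INSTANCES=true \
concourse web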

In this guide, we'll cover:

  1. Writing a pipeline to Test, Build & Deploy a branch to a staging environment. We'll use Terraform for our deployment

  2. Tracking Branches in a repository; for each branch, we'll set a pipeline (using the set_pipeline step and across)

  3. Automatically Cleaning Up Old Workspaces after branches get merged or deleted

Test, Build & Deploy

We'll start out by defining the pipeline that should run for each active branch. For this example, we'll be working with a sample Go application (the apps/golang module in the concourse/examples repository).

Our pipeline will have three stages:

  1. Run unit tests

  2. Build and upload a binary to a blobstore (in our case, we'll use Google Cloud Storage)

  3. Trigger a terraform apply to deploy our app to a staging environment. The Terraform module we'll use here doesn't actually provision any infrastructure, and is just used as an example

Since the pipeline config is intended to be used as a template for multiple different branches, we can use Vars to parameterize the config. In particular, we'll use the vars ((feature)) and ((branch)), which represent the name of the feature and the name of the branch, respectively.

Below is the full pipeline config:

examples/pipelines/multi-branch/template.yml

resource_types:
- name: terraform
  type: registry-image
  source:
    repository: ljfranklin/terraform-resource

- name: gcs
  type: registry-image
  source:
    repository: frodenas/gcs-resource

resources:
- name: branch
  type: git
  source:
    uri: https://github.com/concourse/examples
    branch: ((branch))

- name: examples
  type: git
  source:
    uri: https://github.com/concourse/examples

- name: build-artifact
  type: gcs
  source:
    bucket: concourse-examples
    json_key: ((gcp_service_account_key))
    regexp: multi-branch/features/((feature))/my-app-(.+)\.tgz

- name: staging-env
  type: terraform
  source:
    env_name: ((feature))
    backend_type: gcs
    backend_config:
      bucket: concourse-examples
      prefix: multi-branch/terraform
      credentials: ((gcp_service_account_key))

jobs:
- name: test
  plan:
  - in_parallel:
    - get: branch
      trigger: true
    - get: examples
  - task: unit
    file: examples/tasks/go-test.yml
    input_mapping: {repo: branch}
    params: {MODULE: apps/golang}

- name: build
  plan:
  - in_parallel:
    - get: branch
      passed: [test]
      trigger: true
    - get: examples
  - task: build
    file: examples/tasks/go-build.yml
    params:
      MODULE: apps/golang
      BINARY_NAME: my-app
    input_mapping: {repo: branch}
  - put: build-artifact
    params: {file: "binary/my-app-*.tgz"}

- name: deploy
  plan:
  - in_parallel:
    - get: build-artifact
      passed: [build]
      trigger: true
    - get: examples
  - load_var: bundle_url
    file: build-artifact/url
  - put: staging-env
    params:
      terraform_source: examples/terraform/staging
      vars: {bundle_url: ((.:bundle_url))}
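
To try this template out for a single branch before wiring up any automation, you could set it by hand with fly and supply the vars directly (the target name and feature branch below are hypothetical):

fly -t example set-pipeline \
  -p feature-login \
  -c examples/pipelines/multi-branch/template.yml \
  -v branch=feature/login \
  -v feature=login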

Tracking Branches

In addition to the branch pipeline template, we'll also need a pipeline to track the list of branches and set a pipeline for each one.

To track the list of branches in a repository, we can use aoldershaw/git-branches-resource. This resource type emits a new version whenever a branch is created or deleted. It also lets us filter the list of branches with a regular expression. In this case, let's assume our feature branches match the regular expression feature/.*.
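
To get a feel for what this resource produces, here's a hypothetical branches.json for a repository with two feature branches (the shape matches the vars the tracker pipeline reads below; the branch names are made up):

cat feature-branches/branches.json
# [
#   {"name": "feature/login",  "groups": {"feature": "login"}},
#   {"name": "feature/search", "groups": {"feature": "search"}}
# ]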

Below is the full pipeline config for this tracker pipeline:

examples/pipelines/multi-branch/tracker.yml

resource_types:
- name: git-branches
  type: registry-image
  source:
    repository: aoldershaw/git-branches-resource

resources:
- name: feature-branches
  type: git-branches
  source:
    uri: https://github.com/concourse/examples
    # The "(?P<name>pattern)" syntax defines a named capture group.
    # aoldershaw/git-branches-resource emits the value of each named capture
    # group under the `groups` key.
    #
    # e.g. feature/some-feature ==> {"groups": {"feature": "some-feature"}}
    branch_regex: 'feature/(?P<feature>.*)'

- name: examples
  type: git
  source:
    uri: https://github.com/concourse/examples

jobs:
- name: set-feature-pipelines
  plan:
  - in_parallel:
    - get: feature-branches
      trigger: true
    - get: examples
  - load_var: branches
    file: feature-branches/branches.json
  - across:
    - var: branch
      values: ((.:branches))
    set_pipeline: dev
    file: examples/pipelines/multi-branch/template.yml
    instance_vars: {feature: ((.:branch.groups.feature))}
    vars: {branch: ((.:branch.name))}

We set each pipeline as an instanced pipeline, which results in Concourse grouping all of the related dev pipelines together in the UI.
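
Individual instances can also be addressed from the fly CLI by appending their instance vars to the pipeline name. For example (hypothetical target and feature name; the exact flag syntax may differ across fly versions):

fly -t example pause-pipeline -p 'dev/feature:login'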

Cleaning Up Old Workspaces

With the setup described in Tracking Branches, Concourse will automatically archive the pipelines for any branches that get removed. However, Concourse doesn't know that it should also destroy the corresponding Terraform workspaces when a branch is removed. To accomplish this, we can once again make use of the Terraform resource, this time to destroy those workspaces. We'll add another job to the tracker pipeline that figures out which workspaces no longer belong to an active branch and destroys them.

examples/pipelines/multi-branch/tracker.yml

resource_types:
- name: git-branches
  ...

- name: terraform
  type: registry-image
  source:
    repository: ljfranklin/terraform-resource

resources:
- name: feature-branches
  ...

- name: examples
  ...

- name: staging-env
  type: terraform
  source:
    backend_type: gcs
    backend_config: &terraform_backend_config
      bucket: concourse-examples
      prefix: multi-branch/terraform
      credentials: ((gcp_service_account_key))

jobs:
- name: set-feature-pipelines
  ...

- name: cleanup-inactive-workspaces
  plan:
  - in_parallel:
    - get: feature-branches
      passed: [set-feature-pipelines]
      trigger: true
    - get: examples
  - task: find-inactive-workspaces
    config:
      platform: linux
      image_resource:
        type: registry-image
        source: {repository: hashicorp/terraform}
      inputs:
      - name: feature-branches
      outputs:
      - name: extra-workspaces
      params:
        TERRAFORM_BACKEND_CONFIG:
          gcs: *terraform_backend_config
      run:
        path: sh
        args:
        - -c
        - |
          set -euo pipefail

          apk add -q jq

          active_features="$(jq '[.[].groups.feature]' feature-branches/branches.json)"

          jq -n "{terraform: {backend: $TERRAFORM_BACKEND_CONFIG}}" > backend.tf.json
          terraform init

          # List all existing workspaces, ignoring the currently selected
          # 'default' workspace (prefixed with '*')
          active_workspaces="$(terraform workspace list | grep -v '^[*]' | tr -d ' ' | jq --raw-input --slurp 'split("\n") | map(select(. != ""))')"

          jq -n "$active_workspaces - $active_features" > extra-workspaces/workspaces.json
  - load_var: extra_workspaces
    file: extra-workspaces/workspaces.json
  - across:
    - var: workspace
      values: ((.:extra_workspaces))
    put: staging-env
    params:
      terraform_source: examples/terraform/staging
      env_name: ((.:workspace))
      action: destroy
    get_params:
      action: destroy
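
The find-inactive-workspaces task relies on jq's array subtraction to compute the set difference between all existing workspaces and the active feature names. A minimal illustration with hypothetical names:

jq -cn '["login", "search", "old-feature"] - ["login", "search"]'
# => ["old-feature"]

Note that action: destroy is set under both params and get_params: every put step is followed by an implicit get of the version it produced, so the Terraform resource needs the destroy action in both places.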