Bitbucket Pipelines

A worked example of a Bitbucket Pipelines build that packages a DataStar deployment and pushes it to Octopus. In this setup:

  • The deployment file is attached to a Jira story from the DataStar client (Save to Jira / Export to Jira).
  • Each story is worked in a feature branch named after the story key (feature/DAT-243432).
  • The pipeline downloads the deployment file, builds a package, pushes it to Octopus, and creates a release.
  • A merge to master re-runs the pipeline against the last merged feature branch.

This is one shape the pipeline can take. Equivalent flows work with release branches, approved merge requests, or a shared release folder.

Step 1. Download the deployment file

The first job pulls the deployment file from the Jira attachment. The branch name (feature/DAT-...) gives the story key.

- step:
    name: Download Deployment File
    image: python:3.9.10
    script:
      - pip install requests
      - export STORY=$(echo "$BITBUCKET_BRANCH" | sed 's/.*feature\///')
      - python download-deployment.py -d "$BITBUCKET_CLONE_DIR/output" -o "deployment.xml" -b "https://mydomain.atlassian.net" -u "$JIRA_USERNAME" -p "$JIRA_PASSWORD" -s "$STORY"
    artifacts:
      - output/*.xml

The helper script uses Jira's REST API to read the issue, find the deployment.xml attachment, and download it. The full script is in bitbucket/download-deployment.py.
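The core of such a script can be sketched as follows. This is a minimal sketch, not the full bitbucket/download-deployment.py: the `find_attachment` helper is an illustrative name, while the issue endpoint and the attachment `content` download URL are standard Jira Cloud REST API v2 behaviour.

```python
def find_attachment(attachments, filename):
    """Return the attachment entry whose filename matches, or None.

    `attachments` is the list found under fields.attachment on the issue.
    """
    for att in attachments:
        if att.get("filename") == filename:
            return att
    return None


def download_deployment(base_url, user, token, story, filename, dest):
    # requests is the only third-party dependency; imported here so the
    # pure helper above stays usable without it.
    import requests

    # Read the issue, requesting only the attachment field.
    resp = requests.get(
        f"{base_url}/rest/api/2/issue/{story}",
        params={"fields": "attachment"},
        auth=(user, token),
    )
    resp.raise_for_status()

    att = find_attachment(resp.json()["fields"]["attachment"], filename)
    if att is None:
        raise SystemExit(f"{filename} is not attached to {story}")

    # Each attachment entry carries a 'content' URL that serves the raw body.
    body = requests.get(att["content"], auth=(user, token))
    body.raise_for_status()
    with open(dest, "wb") as f:
        f.write(body.content)
```

The pipeline step passes `$JIRA_USERNAME` and `$JIRA_PASSWORD` (an API token for Jira Cloud) as the basic-auth pair.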

Step 2. Create the package

GitSources.Tool installs as a global dotnet tool and reads the deployment file to produce a NuGet package. GitSources uses LibGit2Sharp, which doesn't build cleanly on Ubuntu, so the example uses the Alpine-based SDK image.

- step:
    name: Create Deployment Package
    clone:
      depth: full
    image: mcr.microsoft.com/dotnet/sdk:3.1-alpine
    caches:
      - dotnetcore
    script:
      - export PATH="$PATH:/root/.dotnet/tools"
      - dotnet tool install --global GitSources.Tool --version 1.0.3-beta1.5
      - export STORY=$(echo "$BITBUCKET_BRANCH" | sed 's/.*feature\///')
      - echo "{\"WorkItem\":\"$STORY\",\"Version\":\"$BITBUCKET_BUILD_NUMBER\"}" > "$BITBUCKET_CLONE_DIR/output/metadata.json"
      - dotnet-gitsources -ll debug -gd "$BITBUCKET_CLONE_DIR" -bf "output/deployment.xml" -ad "$BITBUCKET_CLONE_DIR/artifacts" -od "$BITBUCKET_CLONE_DIR/output" -pa "Absolute Technology Ltd" -pk "ORA-$STORY" -pv "$BITBUCKET_BUILD_NUMBER"
    artifacts:
      - artifacts/*.*

metadata.json holds the work-item reference and version so downstream Octopus steps can read them via the DataStar release step template's variables-file feature.
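As a concrete illustration, the file produced by that echo line carries just those two keys. A minimal sketch (the story key and build number are example values):

```python
import json


def build_metadata(story, build_number):
    """Mirror the metadata.json written by the echo line in step 2."""
    return {"WorkItem": story, "Version": build_number}


print(json.dumps(build_metadata("DAT-243432", "17"), indent=2))
# {
#   "WorkItem": "DAT-243432",
#   "Version": "17"
# }
```

Any downstream step that can parse JSON can recover the story key and package version from this one artifact.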

Step 3. Push to Octopus

- step:
    name: Deploy to Octopus
    image: octopusdeploy/octo:6.17.3-alpine
    script:
      - octo push --package ./artifacts/*.nupkg --server "$OCTOPUS_SERVER" --apiKey "$OCTOPUS_APIKEY"

Step 4. Create the Octopus release

Because Octopus selects package versions dynamically, the release step must pin a version for every package in the project:

  • The DataStar.Tools package (the deployment CLI).
  • The templates package (needed for reversal).
  • The scripts package pushed in step 3.

- step:
    name: Create Octopus Release
    image: python:3.9.10
    script:
      - pip install requests
      - export VERSION=$(echo "$BITBUCKET_BRANCH" | sed 's/.*feature\/DAT-//')
      - python octopus-release.py -b "$OCTOPUS_SERVER" -a "$OCTOPUS_APIKEY" -w "$VERSION" -v "$BITBUCKET_BUILD_NUMBER" -p "Oracle Release" -s "DataStar"

The helper script walks the Octopus REST API to resolve the space, project, and channel, then picks the latest published version for each package except the dynamically named scripts package, which uses the value passed in. The full script is in bitbucket/octopus-release.py.
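The version-selection rule itself reduces to a small pure function. A sketch under stated assumptions: `pin_versions` and `latest_by_id` are illustrative names, and in the real script the latest versions come from Octopus feed queries rather than a dict.

```python
def pin_versions(package_ids, latest_by_id, dynamic_id, dynamic_version):
    """Pin a version for every package in the release.

    Each package takes its latest published version, except the
    dynamically named one, which takes the version supplied by
    the pipeline.
    """
    return {
        pkg: dynamic_version if pkg == dynamic_id else latest_by_id[pkg]
        for pkg in package_ids
    }
```

For example, with two fixed packages and the per-story scripts package `ORA-DAT-243432`, only the latter is pinned to the build-supplied version; the rest follow their latest published versions.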

Merge to master

Running the same pipeline against master needs a way to recover the story key, since the branch name no longer encodes it. The approach below reads the last merged feature/... branch via git for-each-ref and writes the resolved variables to a JSON file that later steps can source.
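The recovery amounts to taking the most recently merged feature ref and stripping everything up to and including feature/. As a pure function (`ref_to_story` is an illustrative name):

```python
import re


def ref_to_story(refname):
    """Strip everything up to and including the last 'feature/',
    mirroring the sed expression used in the master pipeline step."""
    return re.sub(r".*feature/", "", refname)


# ref_to_story("refs/remotes/origin/feature/DAT-243432") -> "DAT-243432"
```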

Bitbucket's YAML can't cleanly generate the JSON file inline due to character-escaping limits, so the file is produced by a shell script, metadata.sh:

mkdir -p "$BITBUCKET_CLONE_DIR/output"
printf '{\n  "WorkItem": "%s",\n  "Version": "%s"\n}\n' "$STORY" "$BITBUCKET_BUILD_NUMBER" > "$BITBUCKET_CLONE_DIR/output/metadata.json"

The master-branch pipeline fragment:

branches:
  master:
    - step:
        name: Master Branch Deployment File
        clone:
          depth: full
        image: python:3.9.10
        script:
          - pip install requests
          - export STORY=$(git for-each-ref --count=1 --format='%(refname)' --sort=-committerdate --merged refs/remotes/origin/feature | sed 's/.*feature\///')
          - echo "WorkItem=$STORY" && echo "Version=$BITBUCKET_BUILD_NUMBER"
          - chmod +x metadata.sh
          - ./metadata.sh
          - python download-deployment.py -d "$BITBUCKET_CLONE_DIR/output" -o "deployment.xml" -b "https://absolute-technology.atlassian.net" -u "$JIRA_USERNAME" -p "$JIRA_PASSWORD" -s "$STORY"
        artifacts:
          - output/*.xml
          - output/*.json
    - step:
        name: Create Deployment Package
        clone:
          depth: full
        image: mcr.microsoft.com/dotnet/sdk:3.1-alpine
        caches:
          - dotnetcore
        script:
          - export PATH="$PATH:/root/.dotnet/tools"
          - dotnet tool install --global GitSources.Tool --version 1.*
          - |
            for keyval in $( grep -E '": [^\{]' output/metadata.json | sed -e 's/: /=/' -e "s/\(\,\)$//" ); do
              echo "export $keyval"
              eval export $keyval
            done
          - dotnet-gitsources -ll debug -gd "$BITBUCKET_CLONE_DIR" -bf "output/deployment.xml" -ad "$BITBUCKET_CLONE_DIR/artifacts" -od "$BITBUCKET_CLONE_DIR/output" -pa "Absolute Technology Ltd" -pk "ORA-$WorkItem" -pv "$BITBUCKET_BUILD_NUMBER"
        artifacts:
          - artifacts/*.*