Bitbucket Pipeline
This is an example of implementing a build pipeline using Bitbucket. In this example the deployment file is stored in Jira as an attachment on a Jira user story; this can be done within DataStar by saving or exporting the deployment file to a Jira story. Each story is worked on in a separate branch before being merged to master, and the branch name is used to determine the user story. Once the package has been created it is pushed to Octopus, and the final step is to create the release. This is only intended as a sample process; other strategies can be employed if you wish to build and deploy from a release branch and/or via approved merge requests.
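For example, with a branch naming convention of feature/DAT-123 (the convention assumed throughout this page), the story reference is everything after the last "feature/". A minimal Python sketch of the extraction performed by the sed expressions in the pipeline steps below:

```python
import re

def story_from_branch(branch: str) -> str:
    """Strip everything up to and including 'feature/' from the branch name,
    mirroring sed 's/.*feature\\///' used in the pipeline steps."""
    return re.sub(r'.*feature/', '', branch)

print(story_from_branch('feature/DAT-123'))  # DAT-123
```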
Step 1: Obtain Deployment File
The first step is to obtain the deployment file which the user has attached to their user story via the DataStar Client application. The deployment file contains the information required to create the deployment package.
sample bitbucket-pipelines.yml
- step:
    name: Download Deployment File
    image: python:3.9.10
    script:
      - pip install requests
      - export STORY=$(echo "$BITBUCKET_BRANCH" | sed 's/.*feature\///')
      - python download-deployment.py -d "$BITBUCKET_CLONE_DIR/output" -o "deployment.xml" -b "https://mydomain.atlassian.net" -u "$JIRA_USERNAME" -p "$JIRA_PASSWORD" -s "$STORY"
    artifacts:
      - output/*.xml
In this example we are using Python and the Jira REST API to download the deployment file. The Python script used in this example is shown below.
sample python script
import argparse
import json
import pathlib

import requests

parser = argparse.ArgumentParser()
parser.add_argument('-b', '--base', type=str, required=True)
parser.add_argument('-d', '--directory', type=str, required=True)
parser.add_argument('-o', '--output', type=str, required=True)
parser.add_argument('-s', '--story', type=str, required=True)
parser.add_argument('-u', '--username', type=str, required=True)
parser.add_argument('-p', '--password', type=str, required=True)
args = parser.parse_args()

filename = 'deployment.xml'

session = requests.Session()
session.auth = (args.username, args.password)

# Look up the story, requesting only the attachment field
response = session.get(str(args.base) + '/rest/api/3/issue/' + str(args.story) + '?fields=attachment')
if response.status_code != 200:
    print('Error response ' + str(response.status_code) + ' looking up story ' + args.story)
    raise SystemExit(1)

# Find the attachment named deployment.xml and note its download URI
contentUri = None
data = json.loads(response.content)
fields = data["fields"]
attachment = fields["attachment"]
for item in attachment:
    if item['filename'] == filename:
        contentUri = item['content']
        break
if contentUri is None:
    print('Error response failed to locate content for file ' + filename)
    raise SystemExit(1)

# Download the attachment content into the output directory
pathlib.Path(args.directory).mkdir(parents=True, exist_ok=True)
response = session.get(contentUri)
if response.status_code != 200:
    print('Error response ' + str(response.status_code) + ' looking up content for file ' + filename)
    raise SystemExit(1)
with open(pathlib.Path(args.directory).joinpath(args.output), 'wb') as f:
    f.write(response.content)
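The script above relies on the shape of the Jira issue response: a fields.attachment array whose entries each carry a filename and a content download URI. A small self-contained sketch of that lookup against a hypothetical, trimmed-down response body:

```python
import json

# Hypothetical Jira response, trimmed to only the fields the script reads
sample = json.loads('''{
  "fields": {
    "attachment": [
      {"filename": "notes.txt", "content": "https://example.atlassian.net/attachment/1"},
      {"filename": "deployment.xml", "content": "https://example.atlassian.net/attachment/2"}
    ]
  }
}''')

# Same loop as the script: find the deployment file and keep its content URI
contentUri = None
for item in sample["fields"]["attachment"]:
    if item["filename"] == "deployment.xml":
        contentUri = item["content"]
        break
print(contentUri)  # https://example.atlassian.net/attachment/2
```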
Step 2: Create Package
The deployment package is created using GitSources.Tool. In this example we install GitSources.Tool as a global tool via the .NET Core SDK; an alternative is to create a Docker image with the tool pre-installed. Note that GitSources.Tool uses LibGit2Sharp, which doesn't work on Ubuntu, so we have used Alpine as the base image in this example.
- step:
    name: Create Deployment Package
    clone:
      depth: full
    image: mcr.microsoft.com/dotnet/sdk:3.1-alpine
    caches:
      - dotnetcore
    script:
      - export PATH="$PATH:/root/.dotnet/tools"
      - dotnet tool install --global GitSources.Tool --version 1.0.3-beta1.5
      - export STORY=$(echo "$BITBUCKET_BRANCH" | sed 's/.*feature\///')
      - echo "{\"WorkItem\":\"$STORY\",\"Version\":\"$BITBUCKET_BUILD_NUMBER\"}" > "$BITBUCKET_CLONE_DIR/output/metadata.json"
      - dotnet-gitsources -ll debug -gd "$BITBUCKET_CLONE_DIR" -bf "output/deployment.xml" -ad "$BITBUCKET_CLONE_DIR/artifacts" -od "$BITBUCKET_CLONE_DIR/output" -pa "Absolute Technology Ltd" -pk "ORA-$STORY" -pv "$BITBUCKET_BUILD_NUMBER"
    artifacts:
      - artifacts/*.*
Step 3: Deploy to Octopus
- step:
    name: Deploy to Octopus
    image: octopusdeploy/octo:6.17.3-alpine
    script:
      - octo push --package ./artifacts/*.nupkg --server $OCTOPUS_SERVER --apiKey $OCTOPUS_APIKEY
Step 4: Create Release
The Octopus release is created using the REST API, because the project uses Octopus dynamic versioning and the versions have to be set for the following artifacts that are defined in the release pipeline:
- The DataStar.Tools package that contains the tooling to deploy a release.
- The Templates that are used so that reversal scripts can be generated.
- The Deployment package that was pushed in step 3.
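To illustrate, the release body that the script below assembles has roughly the following shape; the IDs, action name, and versions here are hypothetical, and "123.45" follows the workitem.buildnumber convention the script uses:

```python
import json

# Hypothetical release body matching what octopus-release.py constructs
release_body = {
    "ChannelId": "Channels-1",
    "ProjectId": "Projects-1",
    "Version": "123.45",  # <work item>.<build number>
    "SelectedPackages": [
        {
            "ActionName": "Deploy Oracle Package",  # hypothetical step name
            "PackageReferenceName": "",
            "Version": "45",
        }
    ],
}
print(json.dumps(release_body, indent=2))
```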
- step:
    name: Create Octopus Release
    image: python:3.9.10
    script:
      - pip install requests
      - export VERSION=$(echo "$BITBUCKET_BRANCH" | sed 's/.*feature\/DAT-//')
      - python octopus-release.py -b "$OCTOPUS_SERVER" -a "$OCTOPUS_APIKEY" -w "$VERSION" -v "$BITBUCKET_BUILD_NUMBER" -p "Oracle Release" -s "DataStar"
In this example we are again using Python. The Python script used is shown below.
sample python script
import argparse
import json

import requests

parser = argparse.ArgumentParser()
parser.add_argument('-b', '--base', type=str, required=True)
parser.add_argument('-a', '--apikey', type=str, required=True)
parser.add_argument('-w', '--workitem', type=int, required=True)
parser.add_argument('-v', '--version', type=int, required=True)
parser.add_argument('-c', '--channel', type=str, required=False)
parser.add_argument('-p', '--project', type=str, required=True)
parser.add_argument('-s', '--space', type=str, required=True)
args = parser.parse_args()

spaceName = args.space
projectName = args.project
channelName = "Default"
if args.channel:
    channelName = args.channel
workItem = str(args.workitem)
version = str(args.version)
packageVersion = workItem + '.' + version

session = requests.Session()
session.headers.update({'X-Octopus-ApiKey': str(args.apikey)})

# Resolve the space name to its Id
response = session.get(str(args.base) + '/spaces/all')
if response.status_code != 200:
    print('Error response ' + str(response.status_code) + ' looking up space ' + spaceName)
    raise SystemExit(1)
spaceId = None
spaces = json.loads(response.content)
for item in spaces:
    if item['Name'] == spaceName:
        spaceId = item['Id']
        break
if spaceId is None:
    print('Error response failed to locate spaceId for name ' + spaceName)
    raise SystemExit(1)

# Resolve the project name to its Id within the space
response = session.get(str(args.base) + "/" + spaceId + '/projects/all')
if response.status_code != 200:
    print('Error response ' + str(response.status_code) + ' looking up project ' + projectName)
    raise SystemExit(1)
projectId = None
projects = json.loads(response.content)
for item in projects:
    if item['Name'] == projectName:
        projectId = item['Id']
        break
if projectId is None:
    print('Error response failed to locate projectId for name ' + projectName)
    raise SystemExit(1)

# Resolve the channel name to its Id
response = session.get(str(args.base) + '/projects/' + projectId + "/channels")
if response.status_code != 200:
    print('Error response ' + str(response.status_code) + ' looking up channels')
    raise SystemExit(1)
channelId = None
channels = json.loads(response.content)
for item in channels['Items']:
    if item['Name'] == channelName:
        channelId = item['Id']
        break
if channelId is None:
    print('Error response failed to locate channelId for name ' + channelName)
    raise SystemExit(1)
# Fetch the release template to discover which packages need versions
response = session.get(str(args.base) + '/'
                       + spaceId + '/deploymentprocesses/deploymentprocess-'
                       + projectId + '/template?channel=' + channelId)

# Create the release body
releaseBody = {
    "ChannelId": channelId,
    "ProjectId": projectId,
    "Version": packageVersion,
    "SelectedPackages": []
}
templates = json.loads(response.content)
for item in templates['Packages']:
    # Take the latest available version of each package from its feed
    response = session.get(str(args.base) + '/' + spaceId + '/feeds/'
                           + item['FeedId'] + '/packages/versions?packageId='
                           + item['PackageId'] + '&take=1')
    versionSpec = packageVersion
    versionInfo = json.loads(response.content)
    for versionItem in versionInfo['Items']:
        versionSpec = versionItem['Version']
        break
    # If the feed returned no versions, fall back to the build number
    if versionSpec == packageVersion:
        versionSpec = version
    releaseBody['SelectedPackages'].append({
        "ActionName": item['ActionName'],
        "PackageReferenceName": item['PackageReferenceName'],
        "Version": versionSpec
    })

print('Creating Release:' + json.dumps(releaseBody))
result = session.post(str(args.base) + '/' + spaceId
                      + '/releases?ignoreChannelRules=false', json=releaseBody)
if result.status_code != 200 and result.status_code != 201:
    print('Error response ' + str(result.status_code) + ' message: ' + str(result.content))
    raise SystemExit(1)
Merge to Main Branch
At some point you will want the changes to be reviewed and merged to the main or master branch. However, if you run a build pipeline and you have been using the branch name to identify the user story reference, you will hit an issue when the pipeline runs against master. One option is to determine the feature branch from the last merged branch, then determine the variables you need and write them to a JSON file so that they can be read by subsequent stages in the pipeline.
In the example below we run a git command to identify the branch that was merged to master and export the variables to a JSON file. We use a shell script, metadata.sh, to create the JSON file containing the variables. Note that with Bitbucket we can't generate the JSON file inline due to some character limitations, so it needs to be a shell script. When we need the variables we can simply read the JSON file (see step: Create Deployment Package).
metadata.sh
mkdir -p "$BITBUCKET_CLONE_DIR/output"
printf "{\n \"WorkItem\": \"$STORY\",\n \"Version\": \"$BITBUCKET_BUILD_NUMBER\"\n}" > "$BITBUCKET_CLONE_DIR/output/metadata.json"
pipeline yaml
branches:
  master:
    - step:
        name: Master Branch Deployment File
        clone:
          depth: full
        image: python:3.9.10
        script:
          - pip install requests
          - export STORY=$(echo | git for-each-ref --count=1 --format='%(refname)' --sort=-committerdate refs/remotes/origin/feature --merged | sed 's/.*feature\///')
          - echo "WorkItem=$STORY" && echo "Version=$BITBUCKET_BUILD_NUMBER"
          - chmod +x metadata.sh
          - ./metadata.sh
          - python download-deployment.py -d "$BITBUCKET_CLONE_DIR/output" -o "deployment.xml" -b "https://absolute-technology.atlassian.net" -u "$JIRA_USERNAME" -p "$JIRA_PASSWORD" -s "$STORY"
        artifacts:
          - output/*.xml
          - output/*.json
    - step:
        name: Create Deployment Package
        clone:
          depth: full
        image: mcr.microsoft.com/dotnet/sdk:3.1-alpine
        caches:
          - dotnetcore
        script:
          - export PATH="$PATH:/root/.dotnet/tools"
          - dotnet tool install --global GitSources.Tool --version 1.*
          - |
            for keyval in $( grep -E '": [^\{]' output/metadata.json | sed -e 's/: /=/' -e "s/\(\,\)$//"); do
              echo "export $keyval"
              eval export $keyval
            done
          - dotnet-gitsources -ll debug -gd "$BITBUCKET_CLONE_DIR" -bf "output/deployment.xml" -ad "$BITBUCKET_CLONE_DIR/artifacts" -od "$BITBUCKET_CLONE_DIR/output" -pa "Absolute Technology Ltd" -pk "ORA-$WorkItem" -pv "$BITBUCKET_BUILD_NUMBER"
        artifacts:
          - artifacts/*.*
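If the grep/sed parsing of metadata.json above ever proves fragile, the same file can be read with Python's json module instead. A minimal sketch, using a temporary file to stand in for the "$BITBUCKET_CLONE_DIR/output/metadata.json" the pipeline writes (note that values set via os.environ only apply to the current process, so in Bitbucket they would still need to be exported from a shell step):

```python
import json
import os
import pathlib
import tempfile

# Write a sample metadata.json purely for illustration; the pipeline writes
# the real file to "$BITBUCKET_CLONE_DIR/output/metadata.json"
tmp = pathlib.Path(tempfile.mkdtemp()) / 'metadata.json'
tmp.write_text('{"WorkItem": "DAT-123", "Version": "42"}')

# Parse the file and expose the values as environment variables
metadata = json.loads(tmp.read_text())
os.environ['WorkItem'] = metadata['WorkItem']
os.environ['Version'] = metadata['Version']
print(os.environ['WorkItem'], os.environ['Version'])  # DAT-123 42
```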