DevOps and CI/CD using Google Cloud with bitbucket.org
While working with a client recently, they asked whether Google Cloud could be used to build, package, and deploy the code in their bitbucket.org source repository to Google Cloud Run.
The code is a Node.js frontend application built with the Angular framework. The whole process looks something like this -
Permissions and Access
The user principal running the whole process needs the following permissions:
cloudbuild.builds.create
secretmanager.secrets.create
secretmanager.secrets.setIamPolicy
secretmanager.versions.add
serviceusage.services.enable
serviceusage.services.list
source.repos.create
source.repos.updateRepoConfig
The user should also have admin permissions on the Bitbucket repository so that GCP can be granted access to it to build the Docker container package.
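As a sketch, the required APIs can be enabled and the permissions above bundled into a custom role from the command line. PROJECT_ID, the role name cicdPipelineRunner, and the user email are placeholders, not values from this setup:

```shell
# Enable the APIs used by this pipeline (standard GCP service IDs).
gcloud services enable cloudbuild.googleapis.com clouddeploy.googleapis.com \
  run.googleapis.com secretmanager.googleapis.com \
  --project=PROJECT_ID

# Bundle the permissions listed above into a custom role.
gcloud iam roles create cicdPipelineRunner --project=PROJECT_ID \
  --title="CI/CD Pipeline Runner" \
  --permissions=cloudbuild.builds.create,secretmanager.secrets.create,secretmanager.secrets.setIamPolicy,secretmanager.versions.add,serviceusage.services.enable,serviceusage.services.list,source.repos.create,source.repos.updateRepoConfig

# Grant the custom role to the user principal running the process.
gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="user:you@example.com" \
  --role="projects/PROJECT_ID/roles/cicdPipelineRunner"
```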
Connect to the bitbucket.org repository
The repo contains a Dockerfile that details the instructions for building the container image that will be deployed to Cloud Run.
# Build on a slim Node.js base image
FROM node:17-slim
WORKDIR /usr/app
# Copy the application source into the image
COPY ./ /usr/app
# Install the Angular CLI and project dependencies, then build the app
RUN npm install -g @angular/cli
RUN npm install
RUN npm run build
# The app listens on port 4200
EXPOSE 4200
CMD ["node", "index.js"]
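Before wiring up Cloud Build, the Dockerfile can be sanity-checked locally. A sketch, where the image tag mvp-frontend is an arbitrary local name:

```shell
# Build the image locally and run it, mapping the exposed port.
docker build -t mvp-frontend .
docker run --rm -p 4200:4200 mvp-frontend

# In another terminal, confirm the app responds on the mapped port.
curl -I http://localhost:4200
```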
Connect the bitbucket.org cloud repository that contains the code used to build the Docker container image.
The permissions granted earlier allow GCP to connect to Bitbucket to list and access the code repo.
As of this writing, Google requires the region to be specified as ‘global’ to integrate Bitbucket Cloud.
Build the package
With the repository integration set up, create a trigger to build the container image package. The steps are detailed here if you want to follow along visually. For simplicity, the build trigger has been set to ‘Manual invocation’, but this can be changed to trigger on push as required.
Once the trigger has been set up, it should show up in the list. Clicking on the repository link should take you to the repository, which serves as a quick test that the build trigger can access it.
Click ‘RUN’ to run the build; it should create the Docker image that will be the input for the release/deploy pipeline. A successful execution should give an output similar to this -
While this build trigger uses Google Container Registry (gcr.io), Google recommends Artifact Registry as it works towards the Container Registry sunset.
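For anyone moving ahead of the sunset, a sketch of creating a Docker-format Artifact Registry repository; the repository name docker-repo is a placeholder, not part of this setup:

```shell
# Create a Docker-format Artifact Registry repository in the build region.
gcloud artifacts repositories create docker-repo \
  --repository-format=docker \
  --location=us-west1 \
  --description="Container images for the CI/CD pipeline"

# Images would then be referenced as:
#   us-west1-docker.pkg.dev/PROJECT_ID/docker-repo/mvp-frontend:TAG
```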
Create a release pipeline
Unlike the Cloud Build setup, the pipeline that delivers the code to Cloud Run using Cloud Deploy is created from the local machine.
Create a working directory on your local machine to hold the deployment artifacts - in this instance I’ve created deploy-cloud-run.
Create the skaffold.yaml file in this directory -
apiVersion: skaffold/v3alpha1
kind: Config
metadata:
  name: mvp-frontend
profiles:
- name: dev
  manifests:
    rawYaml:
    - run-dev.yaml
deploy:
  cloudrun: {}
Multiple environments can be specified in the skaffold.yaml file to direct it to different environments. In this case I’ve kept it simple to push to DEV.
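As a sketch of what multiple environments could look like, a second profile can be added alongside the first; run-prod.yaml here is a hypothetical manifest, not part of this setup:

```yaml
profiles:
- name: dev
  manifests:
    rawYaml:
    - run-dev.yaml
- name: prod
  manifests:
    rawYaml:
    - run-prod.yaml
```

Each pipeline stage in clouddeploy.yaml can then select its profile via the profiles field.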
The skaffold.yaml file references the target manifest run-dev.yaml, which has the following configuration -
apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: deploy-qs-dev
spec:
  template:
    spec:
      containers:
      - image: mvp-frontend-image
        ports:
        - containerPort: 4200
The name in the metadata reflects the name of the service in Cloud Run. The ports.containerPort defaults to 8080 if not specified.
Create the Cloud Deploy delivery pipeline configuration, named clouddeploy.yaml -
apiVersion: deploy.cloud.google.com/v1
kind: DeliveryPipeline
metadata:
  name: mvp-frontend
description: main application pipeline
serialPipeline:
  stages:
  - targetId: run-qsdev
    profiles: [dev]
---
apiVersion: deploy.cloud.google.com/v1
kind: Target
metadata:
  name: run-qsdev
description: Cloud Run development service
run:
  location: projects/#######-lake-infra/locations/us-west1
Register the pipeline and targets with Google Cloud Deploy using the following command -
gcloud deploy apply --file=clouddeploy.yaml --region=us-west1 --project=######-lake-infra
The pipeline and targets will be registered in Cloud Deploy and show up in the console as follows-
Deploy
Create a release to the environment using the following command -
gcloud deploy releases create test-release-001 \
  --project=######-lake-infra \
  --region=us-west1 \
  --delivery-pipeline=mvp-frontend \
  --images=mvp-frontend-image=gcr.io/######-lake-infra/bitbucket.org/bb-org/mvp-frontend:e5f949160525baac09450c31529a03fd2601c981
Substitute the details as needed for your project; if everything is OK, the service should register itself in Cloud Run.
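To confirm the rollout from the command line, a couple of sketch commands; the service and pipeline names follow the configuration above:

```shell
# Check the rollout status of the release.
gcloud deploy rollouts list \
  --delivery-pipeline=mvp-frontend \
  --release=test-release-001 \
  --region=us-west1

# Inspect the deployed Cloud Run service and print its URL.
gcloud run services describe deploy-qs-dev \
  --region=us-west1 \
  --format="value(status.url)"
```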
Cloud Deploy provides a visual view of the history with regards to preceding builds.
Summary
This gives a simple example of turning the code in your repo - Bitbucket Cloud in this case - into a service running on Cloud Run.
In this scenario, I used Bitbucket’s cloud repository. Cloud Build can also work with GitHub as well as private Git or Bitbucket repositories, though that may involve somewhat more extensive configuration to meet security requirements. GCP’s Cloud Source Repositories can be integrated with Cloud Build, and since it shares the same platform, the integration is seamless.
I’ve only scratched the surface of Cloud Deploy with this post. It can deploy the package from the same pipeline to multiple environments, which helps maintain the integrity of your code as it moves from one environment to another. It can also be used to deploy to GKE and Anthos.
GCP’s Cloud Workstations is a fully managed development environment with access to secure and fast environments via browser or local IDE. These reside in your VPC, so you get to develop and deploy code from the confines of your private VPC network.