CI/CD pipeline with GitLab CI, Cloud Build and Cloud Run - Configuration

EZEKIAS BOKOVE
4 min read · May 8, 2021


Before you begin, please take the time to read the article on the pipeline architecture.

Here we will see how to configure this architecture in a development and production environment.

- Configuration of services on GCP

  • Create two GCP projects: one for the development environment and another for the production environment. The configuration will be the same for both environments, because we will focus only on the settings needed for deployment on Cloud Run.
  • Enable the Cloud Build, Artifact Registry and Cloud Run APIs.
  • Go to the Cloud Build settings and assign the Cloud Run Admin and Service Account User roles to Cloud Build.
  • Create a Docker repository in Artifact Registry.
  • Create a service account with the Cloud Build Service Account role and Project > Viewer.
  • Then generate a new JSON key and download it.
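The steps above can be sketched with the gcloud CLI; the project ID, region, repository and service account names (`my-project-dev`, `europe-west1`, `my-repo`, `gitlab-deployer`) are placeholders to adapt to your setup:

```shell
# Target the development project (repeat with the production project ID).
gcloud config set project my-project-dev

# Enable the Cloud Build, Artifact Registry and Cloud Run APIs.
gcloud services enable cloudbuild.googleapis.com \
  artifactregistry.googleapis.com run.googleapis.com

# Create a Docker repository in Artifact Registry.
gcloud artifacts repositories create my-repo \
  --repository-format=docker --location=europe-west1

# Create the deployment service account and grant the roles from the article.
gcloud iam service-accounts create gitlab-deployer
gcloud projects add-iam-policy-binding my-project-dev \
  --member="serviceAccount:gitlab-deployer@my-project-dev.iam.gserviceaccount.com" \
  --role="roles/cloudbuild.builds.builder"
gcloud projects add-iam-policy-binding my-project-dev \
  --member="serviceAccount:gitlab-deployer@my-project-dev.iam.gserviceaccount.com" \
  --role="roles/viewer"

# Generate a JSON key; its contents go into the GitLab variables.
gcloud iam service-accounts keys create key.json \
  --iam-account="gitlab-deployer@my-project-dev.iam.gserviceaccount.com"
```

The Cloud Run Admin and Service Account User roles for Cloud Build itself are assigned in the Cloud Build settings page, as described above.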

Don’t forget to repeat the same configurations for both environments.

- CI/CD environment variable on Gitlab

Click on Settings > CI/CD > Variables > Expand

Then, add the following variables:

GCP_PROJECT_ID : project ID of the production environment.

GCP_PROJECT_ID_DEV : project ID of the development environment.

GCP_SERVICE_KEY : the contents of the JSON key file of the service account created in the production environment.

GCP_SERVICE_KEY_DEV : the contents of the JSON key file of the service account created in the development environment.

- .gitlab-ci.yml Configuration

Let’s analyze the .gitlab-ci.yml file step by step.

In the first step, we will build our Docker image.
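A minimal sketch of this build stage, assuming the image is pushed to the GitLab Container Registry using GitLab's predefined `CI_REGISTRY_*` variables (the stage and job names are illustrative):

```yaml
stages:
  - build
  - test
  - deploy

build:
  stage: build
  image: docker:latest
  services:
    - docker:dind
  script:
    # Build the image and push it to the GitLab Container Registry.
    - docker login -u "$CI_REGISTRY_USER" -p "$CI_REGISTRY_PASSWORD" "$CI_REGISTRY"
    - docker build -t "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA" .
    - docker push "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA"
```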

Once that’s done, we’ll run our Docker image and check that the container works properly: first that the container is running, then that it listens on the expected port. Depending on the needs of your project, you can add as many tests (Dependency-Scanning, Container-Scanning, Secret-Detection, etc.) as you want.
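One possible shape for this test job (the container name and port 8080 are assumptions; use whatever your application exposes):

```yaml
test:
  stage: test
  image: docker:latest
  services:
    - docker:dind
  script:
    - docker login -u "$CI_REGISTRY_USER" -p "$CI_REGISTRY_PASSWORD" "$CI_REGISTRY"
    - docker run -d --name app -p 8080:8080 "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA"
    # Check that the container is running…
    - docker inspect -f '{{.State.Running}}' app | grep -q true
    # …and that the expected port is published.
    - docker port app 8080
```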

If the previous steps succeed, we call our cloudbuild-dev.yaml file to deploy the application when we are on the develop branch; on the master branch, we use our cloudbuild-prod.yaml file instead.
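Sketched as two deploy jobs, each authenticating with the matching service account key from the CI/CD variables and handing off to Cloud Build (the file names follow the article; the image tag is illustrative):

```yaml
deploy-dev:
  stage: deploy
  image: google/cloud-sdk:slim
  only:
    - develop
  script:
    - echo "$GCP_SERVICE_KEY_DEV" > key.json
    - gcloud auth activate-service-account --key-file=key.json
    - gcloud builds submit --project "$GCP_PROJECT_ID_DEV" --config cloudbuild-dev.yaml

deploy-prod:
  stage: deploy
  image: google/cloud-sdk:slim
  only:
    - master
  script:
    - echo "$GCP_SERVICE_KEY" > key.json
    - gcloud auth activate-service-account --key-file=key.json
    - gcloud builds submit --project "$GCP_PROJECT_ID" --config cloudbuild-prod.yaml
```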

You can also perform other rounds of tests with Cloud Build before deploying your application to Cloud Run. It will all depend on your goals.

As you can see, the deployment in the development environment receives 100% of the traffic while the production environment receives 0%. This is normal for continuous delivery: shifting traffic to the new revision in production is done manually.

NB : when you use the `--no-traffic` flag on your very first deployment, you will get an error message, because the first revision of a service must receive 100% of the traffic.
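A plausible cloudbuild-prod.yaml for this setup: the new revision is deployed without receiving traffic. The service name, image path and region are placeholders; the dev variant simply drops the `--no-traffic` flag so the revision gets 100% of the traffic.

```yaml
steps:
  - name: 'gcr.io/google.com/cloudsdktool/cloud-sdk'
    entrypoint: gcloud
    args:
      - run
      - deploy
      - my-service
      - --image=europe-west1-docker.pkg.dev/$PROJECT_ID/my-repo/my-app:latest
      - --region=europe-west1
      - --platform=managed
      - --no-traffic   # the new revision starts with 0% of the traffic
```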

Best Practice

A single docker image (artifact) for all environments.

1 - Move our image from GitLab Container Registry to Artifact Registry.

Instead of rebuilding your application image in Cloud Build before deployment, you can retrieve the image from the GitLab Container Registry and push it to the Artifact Registry. After that, you just have to deploy your application with the Artifact Registry image.

For the configuration :

  • Create a service account with the role of Artifact Registry Writer.
  • Click on Settings > CI/CD > Variables > Expand

Then, add the following variable:

GCP_SERVICE_KEY_PUSH : the contents of the JSON key file of the service account that has only the Artifact Registry Writer role.
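One way to sketch the push job: pull the image from the GitLab Container Registry, then authenticate to Artifact Registry with the key (the `_json_key` username is Google's documented convention for key-file logins); the registry path, repository and tag are placeholders.

```yaml
push-to-artifact-registry:
  stage: push
  image: docker:latest
  services:
    - docker:dind
  script:
    - docker login -u "$CI_REGISTRY_USER" -p "$CI_REGISTRY_PASSWORD" "$CI_REGISTRY"
    - docker pull "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA"
    # Log in to Artifact Registry with the write-only service account key.
    - echo "$GCP_SERVICE_KEY_PUSH" | docker login -u _json_key --password-stdin europe-west1-docker.pkg.dev
    - docker tag "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA" "europe-west1-docker.pkg.dev/$GCP_PROJECT_ID/my-repo/my-app:$CI_COMMIT_SHORT_SHA"
    - docker push "europe-west1-docker.pkg.dev/$GCP_PROJECT_ID/my-repo/my-app:$CI_COMMIT_SHORT_SHA"
```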


2 - Use the same image in Cloud Build tasks for different projects.

To do so, take the Cloud Build service account of the project that runs the deployment, then grant it the Artifact Registry Reader role in the project where your Docker image is stored.
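As a command-line sketch (the project ID and project number are placeholders; the Cloud Build service account has the form `PROJECT_NUMBER@cloudbuild.gserviceaccount.com`):

```shell
# In the project that hosts the image, grant read access to the
# Cloud Build service account of the deploying project.
gcloud projects add-iam-policy-binding image-project-id \
  --member="serviceAccount:123456789@cloudbuild.gserviceaccount.com" \
  --role="roles/artifactregistry.reader"
```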

3 - Deploy the image in another Google Cloud project (Cloud Run example).
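With the read access above in place, a possible Cloud Build config for this case deploys to Cloud Run in the target project while the image stays in the shared image project (service name, image path and region are placeholders):

```yaml
steps:
  - name: 'gcr.io/google.com/cloudsdktool/cloud-sdk'
    entrypoint: gcloud
    args:
      - run
      - deploy
      - my-service
      - --image=europe-west1-docker.pkg.dev/image-project-id/my-repo/my-app:latest
      - --region=europe-west1
      - --platform=managed
```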

Thank you for your attention.


EZEKIAS BOKOVE

GDE & Champion Innovators for Google Cloud. Serverless & DevOps enthusiast. I like to learn from others and to share my knowledge.