Database migration using Cloud Seed.

Nov 21, 2022 · 5 min read

To begin, we will first explore Cloud Seed.

Cloud Seed is an open-source collaboration between Google Cloud and GitLab to accelerate cloud adoption and app-modernization.

Cloud Seed makes it ridiculously simple to provision and consume Google Cloud services within the GitLab web UI.

Cloud Seed

In this article, we will look at how to use Cloud Seed to build database migration pipelines on Cloud SQL. Cloud Seed lets us do a number of things, such as:

  • Configure the Cloud Run deployment pipeline
  • Deploy container-based web applications on Cloud Run
  • Create Cloud SQL instances for PostgreSQL, MySQL, SQL Server, …

The objective of this article is to build on what Cloud Seed sets up for us in order to perform additional actions, such as database migration.

Let’s go …

- Setting up the environment

In the GitLab project dashboard, select Infrastructure > Google Cloud > Configuration. After that, you will be prompted to log in with your Google Cloud account.

Once this is done, we will use Cloud Seed to create a service account that will allow us to access some Google Cloud services such as: Cloud Run, Cloud SQL for Postgres, Cloud Storage, …

Personally, I think that a single service account holding such a broad set of roles is not ideal from a security standpoint.

When creating the service account we will choose the branch or tag that will be linked to it.

Create a service account from GitLab

Now, we will configure the region in which we will run our workload.

Setting up a GCP region on GitLab

Finally, we will create a Cloud SQL instance for Postgres.

Go to Infrastructure > Google Cloud > Databases.

Create a Cloud SQL instance for Postgres in GitLab
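For readers who prefer the CLI, an equivalent instance can also be provisioned with gcloud; a minimal sketch (the instance name, region, tier, database, and user below are hypothetical):

```shell
# Create a small Cloud SQL for PostgreSQL instance (values are hypothetical)
gcloud sql instances create students-db \
  --database-version=POSTGRES_14 \
  --region=europe-west1 \
  --tier=db-f1-micro

# Create the application database and a user on that instance
gcloud sql databases create students --instance=students-db
gcloud sql users create app --instance=students-db --password=CHANGE_ME
```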

Once this is done, we will configure the .gitlab-ci.yml file for continuous integration.

```yaml
# File: .gitlab-ci.yml
stages:
  - database-migration

database-migration:
  stage: database-migration
  # image assumed: the migration script needs gcloud and jq
  image: google/cloud-sdk
  script:
    # script name assumed; this is the shell script shown below
    - chmod +x migration.sh
    - ./migration.sh
  environment:
    name: migration/$CI_COMMIT_REF_NAME
    action: stop
```

In the .gitlab-ci.yml file, we run the following shell script 👇.

```sh
#!/bin/bash

## utils

error_and_exit() {
  local MESSAGE="${1}"
  local HINT="${2}"
  echo ""
  echo "🟥🟥🟥🟥🟥🟥🟥🟥🟥🟥🟥🟥"
  echo ""
  echo "Error"
  echo "===================="
  echo ""
  echo "Message"
  echo "-------"
  echo "$MESSAGE"
  echo ""
  echo "Hint"
  echo "----"
  echo "$HINT"
  echo ""
  echo "/end"
  echo ""
  exit 1
}

## required environment variables

if [[ -z "$GCP_PROJECT_ID" ]]; then
  error_and_exit "\$GCP_PROJECT_ID is not set" "Did you setup a service account?"
fi

if [[ -z "$GCP_SERVICE_ACCOUNT" ]]; then
  error_and_exit "\$GCP_SERVICE_ACCOUNT is not set" "Did you setup a service account?"
fi

if [[ -z "$GCP_SERVICE_ACCOUNT_KEY" ]]; then
  error_and_exit "\$GCP_SERVICE_ACCOUNT_KEY is not set" "Did you setup a service account?"
fi

if [[ -z "$CI_PROJECT_ID" ]]; then
  error_and_exit "\$CI_PROJECT_ID is not set"
fi

if [[ -z "$CI_COMMIT_REF_SLUG" ]]; then
  error_and_exit "\$CI_COMMIT_REF_SLUG is not set"
fi

## private variables

# file name assumed; any writable path works
__GCP_SERVICE_ACCOUNT_KEY_FILE_NAME="gcp-sa-key-$CI_PROJECT_ID-$CI_COMMIT_REF_SLUG.json"

## cleanup tmp files

trap 'rm -f "$__GCP_SERVICE_ACCOUNT_KEY_FILE_NAME"' EXIT

## extract service account key file

# the key may be stored base64-encoded (.privateKeyData) or raw (.private_key_data)
if echo "$GCP_SERVICE_ACCOUNT_KEY" | jq -e 'has("privateKeyData")' >/dev/null; then
  echo "🟩🟩 '.privateKeyData' found"
  echo "$GCP_SERVICE_ACCOUNT_KEY" | jq --raw-output '.privateKeyData' | base64 -d >$__GCP_SERVICE_ACCOUNT_KEY_FILE_NAME
elif echo "$GCP_SERVICE_ACCOUNT_KEY" | jq -e 'has("private_key_data")' >/dev/null; then
  echo "🟩🟩 '.private_key_data' found"
  echo "$GCP_SERVICE_ACCOUNT_KEY" | jq --raw-output '.private_key_data' >$__GCP_SERVICE_ACCOUNT_KEY_FILE_NAME
else
  error_and_exit "Failed to extract service account key file" "Did you setup a service account?"
fi

## gcloud auth and configure

gcloud auth activate-service-account --key-file $__GCP_SERVICE_ACCOUNT_KEY_FILE_NAME || error_and_exit "Failed to activate service account"
gcloud config set project $GCP_PROJECT_ID || error_and_exit "Failed to set GCP project"

## gcloud command for migration

gcloud builds submit . --config=cloudbuild.yaml || error_and_exit "Failed to run database migration"
```

With this script, we establish the connection to Google Cloud and call Cloud Build, which runs the cloudbuild.yaml file to perform our migration.

```yaml
# File: cloudbuild.yaml
# Note: the builder image names and the registry host were stripped from the
# original snippet; the standard Cloud Build Docker builder, the
# exec-wrapper image, and a europe-west1 Artifact Registry host are assumed.
steps:
  # build the container image
  - name: 'gcr.io/cloud-builders/docker'
    args: [ 'build', '-t', 'europe-west1-docker.pkg.dev/$PROJECT_ID/students/${_IMAGE_NAME}', '.' ]

  # push the container image
  - name: 'gcr.io/cloud-builders/docker'
    args: [ 'push', 'europe-west1-docker.pkg.dev/$PROJECT_ID/students/${_IMAGE_NAME}' ]

  # run the Alembic migration against Cloud SQL through the proxy
  - name: 'gcr.io/google-appengine/exec-wrapper'
    entrypoint: 'bash'
    args: [ '-c',
            '/buildstep/execute.sh -i europe-west1-docker.pkg.dev/$PROJECT_ID/students/${_IMAGE_NAME} -s $$CONNECTION_NAME -e DATABASE_URL=$$DATABASE_URL -- alembic upgrade head' ]
    secretEnv: [ 'DATABASE_URL', 'CONNECTION_NAME' ]

substitutions:
  _IMAGE_NAME: "img-students-api"

availableSecrets:
  secretManager:
    - versionName: projects/$PROJECT_ID/secrets/DATABASE_URL/versions/latest
      env: 'DATABASE_URL'
    - versionName: projects/$PROJECT_ID/secrets/CONNECTION_NAME/versions/latest
      env: 'CONNECTION_NAME'
```

In the cloudbuild.yaml file, there are two environment variables, namely DATABASE_URL and CONNECTION_NAME, which we store in Secret Manager for security reasons.
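For context, the DATABASE_URL for a Cloud SQL Postgres instance accessed through the Cloud SQL Proxy typically points at the instance's unix socket under /cloudsql/<CONNECTION_NAME>. A minimal sketch of how such a URL is composed (the driver, names, and values are assumptions for illustration, not taken from the original project):

```python
# Build a SQLAlchemy-style DATABASE_URL for Cloud SQL Postgres reached
# through the Cloud SQL unix socket (all values here are hypothetical).
from urllib.parse import quote_plus

def build_database_url(user: str, password: str, db_name: str, connection_name: str) -> str:
    # The Cloud SQL Proxy exposes the instance at /cloudsql/<CONNECTION_NAME>;
    # the password is URL-encoded so special characters survive in the URL.
    return (
        f"postgresql+psycopg2://{user}:{quote_plus(password)}@/{db_name}"
        f"?host=/cloudsql/{connection_name}"
    )

print(build_database_url("app", "s3cret!", "students",
                         "my-project:europe-west1:students-db"))
```

The resulting string is what would be stored as the DATABASE_URL secret.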



To register a variable in Secret Manager, select Security > Secret Manager from the top left menu in Google Cloud. Then click on CREATE SECRET and enter the name of the variable and the value of the secret. Leave the rest of the settings as default and create your secret.

Create secret in Secret Manager

Note: use your own real values for the secrets.
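The same secrets can be created from the CLI; a quick sketch (the values piped in are placeholders):

```shell
# Create the two secrets read by cloudbuild.yaml (values are placeholders)
echo -n "postgresql+psycopg2://app:CHANGE_ME@/students?host=/cloudsql/PROJECT:REGION:students-db" | \
  gcloud secrets create DATABASE_URL --data-file=-

echo -n "PROJECT:REGION:students-db" | \
  gcloud secrets create CONNECTION_NAME --data-file=-
```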

- Service account

We will make changes to the service account that was created above, because it has some roles that we do not need. To do this, select IAM & Admin followed by IAM from the top left menu. Once you are in IAM, look for the service account of the form gitlab-xxxxxxxxxxxxxxxxxxxxxx and modify its roles as follows 👇.

To perform our migration, the service account will need the following roles: Cloud Build Service Account, Cloud SQL Client, Secret Manager Secret Accessor and Service Account User.
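These bindings can also be applied from the CLI; a sketch, assuming a placeholder project ID and service account email:

```shell
# Grant the four required roles to the GitLab-created service account
# (PROJECT_ID and the service account email are placeholders)
SA="gitlab-xxxxx@PROJECT_ID.iam.gserviceaccount.com"

for ROLE in roles/cloudbuild.builds.builder \
            roles/cloudsql.client \
            roles/secretmanager.secretAccessor \
            roles/iam.serviceAccountUser; do
  gcloud projects add-iam-policy-binding PROJECT_ID \
    --member="serviceAccount:$SA" --role="$ROLE"
done
```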

- Artifact Registry

Finally, we will create a Docker repository in Artifact Registry to store our Docker images.
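From the CLI, this is a single command; the repository name and region below are assumptions chosen to match the image path used in cloudbuild.yaml:

```shell
# Create a Docker-format Artifact Registry repository named "students"
# (name and location are assumed)
gcloud artifacts repositories create students \
  --repository-format=docker \
  --location=europe-west1 \
  --description="Images for the students API"
```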

It’s time to start our data migration pipeline🤞.

Mission accomplished, the migration was successful.

The code is available here 👇.

Thanks for reading, hope you enjoyed it. See you next time 👋




GDE & Champion Innovator for Google Cloud. Serverless & DevOps enthusiast. I like to learn from others and to share my knowledge with other people.