Database migration using Cloud Seed.

To begin, let's explore Cloud Seed.

Cloud Seed is an open-source collaboration between Google Cloud and GitLab to accelerate cloud adoption and application modernization.

Cloud Seed makes it ridiculously simple to provision and consume Google Cloud services within the GitLab web UI.

In this article, we will look at how to use Cloud Seed to build database migration pipelines for Cloud SQL. Cloud Seed allows us to do a number of things, such as:

  • Configure the Cloud Run deployment pipeline
  • Deploy container-based web applications on Cloud Run
  • Create Cloud SQL instances for PostgreSQL, MySQL, SQL Server, …

The objective of this article is to build on what Cloud Seed provides and go a step further: performing a database migration.

Let’s go …

- Setting up the environment

In the GitLab project dashboard, select Infrastructure > Google Cloud > Configuration. After that, you will be prompted to log in with your Google Cloud account.

Once this is done, we will use Cloud Seed to create a service account that will allow us to access some Google Cloud services such as: Cloud Run, Cloud SQL for Postgres, Cloud Storage, …

Personally, I think that a single service account holding such a variety of roles is not great for security; we will trim its roles down later.

When creating the service account, we will choose the branch or tag that it will be linked to.

Now, we will configure the region in which we will run our workload.

Finally, we will create a Cloud SQL instance for Postgres.

Go to Infrastructure > Google Cloud > Databases.
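For reference, the command-line equivalent of what Cloud Seed provisions here might look like the following. This is only a sketch: the instance name, database version, region, and tier are placeholders, not values from the article.

```shell
# Hypothetical gcloud equivalent of the Cloud Seed "Databases" flow:
# create a Cloud SQL for PostgreSQL instance (all names are examples).
gcloud sql instances create my-postgres-instance \
  --database-version=POSTGRES_14 \
  --region=europe-west1 \
  --tier=db-f1-micro

# Create the application database and a user on that instance.
gcloud sql databases create mydatabase --instance=my-postgres-instance
gcloud sql users create myuser --instance=my-postgres-instance --password=change-me
```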

Once this is done, we will configure the .gitlab-ci.yml file for continuous integration.

In the .gitlab-ci.yml file, we will run the following cloud-migration.sh file 👇.
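A minimal .gitlab-ci.yml for this setup could look like the sketch below. The stage name, job name, and image are assumptions; the script path must match where cloud-migration.sh lives in your repository.

```yaml
# Sketch of a .gitlab-ci.yml that runs the migration script.
# Job/stage names and the image are examples, not the article's exact file.
stages:
  - migrate

migrate-database:
  stage: migrate
  image: google/cloud-sdk:slim   # provides the gcloud CLI
  script:
    - chmod +x ./cloud-migration.sh
    - ./cloud-migration.sh
  only:
    - main                       # run on the branch linked to the service account
```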

With cloud-migration.sh we will establish the connection to Google Cloud and call Cloud Build to run the cloudbuild.yaml file to perform our migration.
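A plausible sketch of cloud-migration.sh is shown below, assuming the service account key is exposed to the job as a base64-encoded CI/CD variable; the variable names GCP_SERVICE_ACCOUNT_KEY and GCP_PROJECT_ID are assumptions, not values from the article.

```shell
#!/bin/sh
# Sketch of cloud-migration.sh: authenticate to Google Cloud, then hand
# the migration off to Cloud Build. Variable names are assumptions.
set -eu

# Decode the service account key stored as a CI/CD variable and activate it.
echo "$GCP_SERVICE_ACCOUNT_KEY" | base64 -d > /tmp/key.json
gcloud auth activate-service-account --key-file=/tmp/key.json
gcloud config set project "$GCP_PROJECT_ID"

# Submit the build that runs the database migration.
gcloud builds submit --config cloudbuild.yaml .
```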

In the cloudbuild.yaml file, there are two environment variables, DATABASE_URL and CONNECTION_NAME, which we will store in Secret Manager for security reasons.

DATABASE_URL=postgresql://username:password@localhost/databasename?host=/cloudsql/project-id:region:instance-id

CONNECTION_NAME=project-id:region:instance-id
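A cloudbuild.yaml along these lines could pull both secrets from Secret Manager and run the migration over the Cloud SQL Auth Proxy socket that DATABASE_URL points to. This is a sketch only: the build image, the migration command, and the assumption that the cloud-sql-proxy binary is available in the image are all placeholders.

```yaml
# Sketch of a cloudbuild.yaml that reads both secrets from Secret Manager.
# The image, migration command, and proxy setup are examples.
steps:
  - id: run-migrations
    name: node:18            # example image; use whatever runs your migrations
    entrypoint: sh
    args:
      - -c
      - |
        # Assumes the cloud-sql-proxy binary is present in the image.
        ./cloud-sql-proxy --unix-socket /cloudsql "$$CONNECTION_NAME" &
        sleep 5
        npx prisma migrate deploy   # example migration command
    secretEnv: ['DATABASE_URL', 'CONNECTION_NAME']

availableSecrets:
  secretManager:
    - versionName: projects/$PROJECT_ID/secrets/DATABASE_URL/versions/latest
      env: DATABASE_URL
    - versionName: projects/$PROJECT_ID/secrets/CONNECTION_NAME/versions/latest
      env: CONNECTION_NAME
```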

To register a variable in Secret Manager, select Security > Secret Manager from the navigation menu in Google Cloud. Then click on CREATE SECRET and enter the name of the variable and the value of the secret. Leave the rest of the settings at their defaults and create your secret.
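The same secrets can be created from the command line instead of the console UI; the values below are the placeholders from above, to be replaced with your real connection details.

```shell
# Create the two secrets from the command line.
# Replace the placeholder values with your real connection details.
printf 'postgresql://username:password@localhost/databasename?host=/cloudsql/project-id:region:instance-id' \
  | gcloud secrets create DATABASE_URL --data-file=-

printf 'project-id:region:instance-id' \
  | gcloud secrets create CONNECTION_NAME --data-file=-
```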

Note: use your real values, not the placeholders shown above.

- Service account

We will make changes to the service account that was created above, because it has some roles we do not need. To do this, select IAM & Admin followed by IAM from the top left menu. Once you are in IAM, look for the service account of the form gitlab-xxxxxxxxxxxxxxxxxxxxxx and modify its roles as follows 👇.

To perform our migration, the Cloud Build service account will need the following roles: Cloud Build Service Account, Cloud SQL Client, Secret Manager Secret Accessor, and Service Account User.
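If you prefer the command line, the equivalent role bindings can be granted with gcloud; PROJECT_ID and the service account e-mail are placeholders.

```shell
# Grant the roles listed above to the Cloud Build service account.
# PROJECT_ID and the service account e-mail are placeholders.
SA="PROJECT_NUMBER@cloudbuild.gserviceaccount.com"

for role in roles/cloudbuild.builds.builder \
            roles/cloudsql.client \
            roles/secretmanager.secretAccessor \
            roles/iam.serviceAccountUser; do
  gcloud projects add-iam-policy-binding PROJECT_ID \
    --member="serviceAccount:$SA" --role="$role"
done
```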

- Artifact Registry

Finally, we will create a Docker repository in Artifact Registry to store our Docker images.
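From the command line, this comes down to a single gcloud call; the repository name and region below are examples.

```shell
# Create a Docker-format repository in Artifact Registry.
# Repository name and region are examples.
gcloud artifacts repositories create my-docker-repo \
  --repository-format=docker \
  --location=europe-west1 \
  --description="Images for the migration pipeline"
```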

It’s time to start our data migration pipeline🤞.

Mission accomplished, the migration was successful.

The code is available here 👇.

Thanks for reading, hope you enjoyed it. See you next time 👋

EZEKIAS BOKOVE

GDE & Innovators Champion for Google Cloud, Serverless & DevOps enthusiast. I like to learn from others and share my knowledge with other people.