How-to migrate a Google Cloud Run App Terraform module deployment to v9.0.0

Before the release of version 9.0.0, our in-house Google Cloud Run application Terraform module used the Cloud Run Admin API v1. The Cloud Run Admin API v2 is now recommended, as it offers a better developer experience and broader support for Cloud Run features. Therefore, starting with the 9.0.0 release, the Cloud Run App module uses the Cloud Run Admin API v2 exclusively. This is a breaking change that requires teams to refactor their configuration in order to migrate. This page describes the steps required to upgrade to version 9.0.0 from previous versions of the module. For more information about the changes and considerations involved in migrating from previous versions to version 9.0.0, please see the explanation page.

Warning

Version 9.0.0 contains breaking changes! The web application will be redeployed once the new configuration is applied!

The steps below describe the "easiest" scenario, where downtime is acceptable. However, there will be situations where downtime is not an option. Due to the variety of parameter combinations across the UIS applications, it is not possible to provide unified instructions for every eventuality. Therefore, if a zero-downtime migration is required, please contact the Cloud Team, who will assist with planning the migration steps.

Migration steps

Since version 9.0.0 of the module uses the Cloud Run Admin API v2 exclusively, it is not possible to use a Terraform moved block to avoid downtime. Instead, the resources must be recreated, which means that downtime cannot be avoided without additional, more complex manipulation. The following steps demonstrate how to migrate to version 9.0.0 with downtime.

Warning

Before starting the upgrade procedure, please make sure your Terraform plan has no pending changes. If there are any changes, please apply them before performing the next steps.

In the most common case, Terraform code using a module version prior to 9.0.0 looks similar to the example below:

module "webapp" {
  source  = "gitlab.developers.cam.ac.uk/uis/gcp-cloud-run-app/devops"
  version = "~> 8.0"

  project          = local.project
  cloud_run_region = local.region
  image_name       = "${local.container_images.webapp_base}:${local.container_images.webapp_tag}"

  grant_sql_client_role_to_webapp_sa = true

  max_scale = 10

  dns_name = (
      (!local.webapp_use_cloud_load_balancer && local.domain_verification.verified)
      ? coalesce(local.webapp_custom_dns_name, trimsuffix(local.webapp_dns_name, "."))
      : ""
  )

  sql_instance_connection_name = module.sql_instance.instance_connection_name

  environment_variables = {
      EXTRA_SETTINGS_URLS = local.extra_settings_urls
  }

  allowed_ingress = "internal-and-cloud-load-balancing"
}

The following steps show how to update it to be compatible with version 9.0.0 of the Cloud Run App module.

Update the module version

First, update the version constraint in the module block:

module "webapp" {
  source  = "gitlab.developers.cam.ac.uk/uis/gcp-cloud-run-app/devops"
  version = "~> 9.0"
  ...
}

If you use Copier, update your project to the latest boilerplate release by following the instructions. At this point, Terraform can download the new module version by running terraform init. The new version will be downloaded, but the command will fail because the module syntax has changed and the configuration needs to be updated.

Update input variables

Since the new version of the module uses a new syntax, the input variables must be updated accordingly. More information about these changes can be found on the explanation page. The full list of variables is available in the module's README file.

In the example code above, we need to update the following options:

  • cloud_run_region: this variable is now renamed to region:

    region  = local.region
    
  • image_name and environment_variables: both of these settings are now part of the containers variable:

    containers = {
      webapp = {
        image = "${local.container_images.webapp_base}:${local.container_images.webapp_tag}"
        env = [{
          name  = "EXTRA_SETTINGS_URLS",
          value = local.extra_settings_urls
        }]
      }
    }
    
  • max_scale: the scaling configuration now has a dedicated scaling variable:

    scaling = {
        max_instance_count = 10
    }
    
  • sql_instance_connection_name: this variable is now renamed to mount_cloudsql_instance:

    mount_cloudsql_instance = module.sql_instance.instance_connection_name
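
  • dns_name: the single dns_name value is replaced by the dns_names map. In this example it becomes the following (see the explanation page for how dns_names interacts with the load balancer configuration):

    dns_names = (local.webapp_use_cloud_load_balancer) ? {
        webapp = coalesce(local.webapp_custom_dns_name, trimsuffix(local.webapp_dns_name, "."))
    } : {}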
    

Adapt other changes

If the project has the load balancer configured, use the moved blocks shown in the "Load balancer" section of the explanation page.
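
For illustration only, a moved block maps a resource's old state address to its new one so that Terraform updates the state in place instead of destroying and recreating the resource. The addresses below are hypothetical placeholders; copy the actual moved blocks for the load balancer resources from the explanation page.

moved {
  # Hypothetical addresses for illustration only; use the real ones from the explanation page.
  from = module.webapp.google_compute_global_address.lb
  to   = module.webapp.google_compute_global_address.load_balancer
}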

If the project has a pre-deploy job configured, adjust the configuration as shown in the "Other changes" section of the explanation page.

In our example, after all of the changes, the code should look similar to the example below:

module "webapp" {
  source  = "gitlab.developers.cam.ac.uk/uis/gcp-cloud-run-app/devops"
  version = "~> 9.0"

  project = local.project
  region  = local.region

  containers = {
      webapp = {
          image = "${local.container_images.webapp_base}:${local.container_images.webapp_tag}"
          env = [{
              name  = "EXTRA_SETTINGS_URLS",
              value = local.extra_settings_urls
          }]
      }
  }

  grant_sql_client_role_to_webapp_sa = true

  scaling = {
      max_instance_count = 10
  }

  dns_names = (local.webapp_use_cloud_load_balancer) ? {
      webapp = coalesce(local.webapp_custom_dns_name, trimsuffix(local.webapp_dns_name, "."))
  } : {}

  mount_cloudsql_instance = module.sql_instance.instance_connection_name

  enable_load_balancer = local.webapp_use_cloud_load_balancer

  enable_monitoring          = true
  monitoring_scoping_project = local.product_meta_project
}

Apply new configuration

Once the code is ready, apply the configuration to the development environment(s). Carefully review the changes and adjust the parameters if needed. If the result is acceptable, apply the configuration to production. Please note that there will be downtime, as the Cloud Run apps will be recreated (along with a subset of auxiliary Terraform resources)!