How to Upload Folders to GCS using Terraform v 0.12.x and GCP

Stewart Smith

Archive article - published on March 27, 2020

In this blog, I am going to explain how to continuously transfer a folder to a Google Cloud Storage (GCS) location, triggering the process only when a file inside the folder changes. More specifically: uploading folders to GCS using Terraform v0.12.x and GCP.

Software used: Terraform v0.12.x and the Google Cloud SDK (for the gsutil command).

Terraform v0.12.x makes it possible to create a trigger that monitors all of the files inside a folder for changes and, when a change is detected, fires a null resource to manipulate the files.
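As a minimal sketch of the two functions involved (the folder path ./my-folder and the output name file_hashes below are just illustrative placeholders), the following output block renders a map from each file name to its SHA-256 checksum; it is the same expression we will reuse in main.tf further down.

# Illustrative sketch only: maps every file under ./my-folder (placeholder path)
# to its SHA-256 checksum, using the 0.12 functions fileset() and filesha256().
output "file_hashes" {
  value = {
    for fn in fileset("./my-folder", "**") :
    fn => filesha256("./my-folder/${fn}")
  }
}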

1. Let's first look at the file tree structure.

├── main.tf
├── provider.tf
├── variables.tf
└── terraform.tfvars

2. provider.tf code

Here, you declare the Google provider and any other providers required by the resources you are going to use.

provider "google" { version = "~> 2.14" project     = var.project region      = var.region zone        = var.zone credentials = "credentials.json"}provider "google-beta" { version = "~> 2.14" project     = var.project region      = var.region zone        = var.zone credentials = "credentials.json"}provider "null" { version = "~> 2.1"}

3. variables.tf code

Create all of the variables that you are going to use in your code.

variable "project" {  type        = string  description = "The project indicates the default GCP project all of your resources will be created in."}variable "region" {  type        = string  description = "The region will be used to choose the default location for regional resources. Regional resources are spread across several zones."}variable "zone" {  type        = string  description = "The zone will be used to choose the default location for zonal resources. Zonal resources exist in a single zone. All zones are a part of a region."}variable "folder_path" { type        = string description = "Path to your folder"}variable "gcs_bucket" { type        = string description = "Your GCS bucket"}

4. terraform.tfvars code

Here you should set the values of all of the variables created above.

// Main variables
project = "your-project"
region  = "us-central1"
zone    = "us-central1-a"

// Folder transfer variables
folder_path = "../path/to/your/folder"
gcs_bucket  = "bucket-name"

5. main.tf code

resource "null_resource" "upload_folder_content" { triggers = {   file_hashes = jsonencode({   for fn in fileset(var.folder_path, "**") :   fn => filesha256("${var.folder_path}/${fn}")   }) } provisioner "local-exec" {   command = "gsutil cp -r ${var.folder_path}/* gs://${var.gcs_bucket}/" }}

The for expression in the triggers block walks through every file inside the given folder (via fileset()) and records its SHA-256 checksum (via filesha256()). Whenever any of those checksums changes, the null_resource is replaced, and the local-exec provisioner runs the gsutil command to copy the whole folder into the provided GCS bucket.
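As a usage sketch (the file name new-file.txt is purely an example), the first apply uploads the folder, and changing any file causes the next apply to replace the null_resource and re-run the copy:

# First run: creates null_resource.upload_folder_content and uploads the folder
terraform apply

# Example only: adding or editing any file changes its checksum...
echo "hello" > ../path/to/your/folder/new-file.txt

# ...so the next apply replaces the null_resource and re-runs the gsutil copy
terraform apply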

We hope you found this “How To” blog helpful for the next time you need to upload folders to GCS using Terraform v0.12.x and GCP! If you have any additional questions or need professional help with the task, please feel free to reach out to us!
