Cloud Functions Series 1: Trigger Cloud Functions by Cloud Storage upload events (2nd gen)

Masato Naka
Feb 4, 2024

Overview

In this first post of our Cloud Functions Series, we look at how to trigger Cloud Functions (2nd gen) from Cloud Storage upload events. Within the Google Cloud Platform (GCP) ecosystem, Cloud Functions and Cloud Storage combine into a robust building block for event-driven architectures: storage events flow through Pub/Sub and Eventarc to your function, which in 2nd gen runs on top of Cloud Run.

Enable APIs

To set up Cloud Functions (2nd gen) together with Cloud Run and Eventarc, the first step is to ensure the necessary APIs are activated. Enable the following APIs:

  1. Pub/Sub (pubsub.googleapis.com)
  2. Artifact Registry (artifactregistry.googleapis.com)
  3. Eventarc (eventarc.googleapis.com)
  4. Cloud Run (run.googleapis.com)

As Cloud Functions (2nd gen) is built on both Cloud Run and Eventarc, enabling these APIs up front ensures the essential components are ready for integration.
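For example, you can enable them all at once with gcloud (a quick sketch, assuming the gcloud CLI is authenticated against your project):

gcloud services enable \
  pubsub.googleapis.com \
  artifactregistry.googleapis.com \
  eventarc.googleapis.com \
  run.googleapis.com \
  --project <your_project>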

For a comprehensive understanding of Cloud Functions versions, refer to the Cloud Functions version comparison.

Create a Cloud Storage Bucket and configure necessary permissions

Using Terraform, the following configuration creates a Google Cloud Storage bucket named “${var.project}-test-bucket” in the “asia-northeast1” region:

resource "google_storage_bucket" "test_bucket" {
name = "${var.project}-test-bucket"
location = "asia-northeast1"
}

data "google_storage_project_service_account" "gcs_account" {
}

// Grant pubsub.publisher permission to storage project service account
resource "google_project_iam_binding" "google_storage_project_service_account_is_pubsub_publisher" {
project = var.project
role = "roles/pubsub.publisher"

members = [
data.google_storage_project_service_account.gcs_account.member,
]
}

Granting the roles/pubsub.publisher role to the Cloud Storage project service account is crucial: Cloud Storage publishes object change notifications through Pub/Sub, and Eventarc then routes them to your function.
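If you want to confirm which account that is, recent gcloud versions can print the project’s Cloud Storage service agent (assuming the gcloud CLI is authenticated):

# Prints something like service-<project-number>@gs-project-accounts.iam.gserviceaccount.com
gcloud storage service-agent --project <your_project>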

Grant permission to uploader

When a service account or a user uploads objects to the Cloud Storage bucket, it needs the appropriate permissions. For instance, granting the roles/storage.objectUser role allows object uploads. Refer to the documentation for more details.

Using Terraform, you can configure the permission grant as follows:

For a service account named “yourservice”:

resource "google_service_account" "yourservice" {
account_id = "yourservice"
display_name = "yourservice"
description = "yourservice to upload files to test-bucket"
}

resource "google_storage_bucket_iam_member" "sa_yourservice_is_test_bucket_object_user" {
bucket = google_storage_bucket.test_bucket.name
role = "roles/storage.objectUser"
member = google_service_account.yourservice.member
}

For a user (e.g., john@gmail.com):

resource "google_storage_bucket_iam_member" "john_is_test_bucket_object_user" {
bucket = google_storage_bucket.test_bucket.name
role = "roles/storage.objectUser"
member = "user:john@gmail.com"
}
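If you prefer a one-off grant over Terraform, the gcloud equivalent would look like this (a sketch; the bucket name assumes the Terraform config above):

gcloud storage buckets add-iam-policy-binding gs://<your_project>-test-bucket \
  --member="user:john@gmail.com" \
  --role="roles/storage.objectUser"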

Create Service Account for Cloud Functions

In order to adhere to the principle of least privilege, we establish a dedicated service account named cloud-function-sa for Cloud Functions. This service account is granted the necessary permissions, including:

  1. roles/cloudfunctions.serviceAgent
  2. roles/eventarc.eventReceiver

Using Terraform, the configuration is as follows:

resource "google_service_account" "cloud_function_sa" {
account_id = "cloud-function-sa"
display_name = "cloud-function-sa"
description = "For Cloud Function cloud-function-sa"
}

resource "google_project_iam_member" "cloud_function_sa" {
project = var.project
for_each = {
for role in [
"roles/cloudfunctions.serviceAgent",
"roles/eventarc.eventReceiver",
] : role => role
}
role = each.value
member = google_service_account.cloud_function_sa.member
}

This configuration ensures that the designated service account has the precise permissions required for Cloud Functions, maintaining a secure and least-privileged access model.
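To double-check the grants, you can list the roles bound to the service account (a sketch using standard gcloud filter flags):

gcloud projects get-iam-policy <your_project> \
  --flatten="bindings[].members" \
  --filter="bindings.members:cloud-function-sa@<your_project>.iam.gserviceaccount.com" \
  --format="value(bindings.role)"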

Create Cloud Functions (with Cloud Storage trigger)

To deploy Cloud Functions with a Cloud Storage trigger, you can follow the steps outlined in the official documentation. In this example, the CloudEvents SDK is used to receive events from the Cloud Storage trigger. The main.py file contains the function logic:


import functions_framework
from cloudevents.http import CloudEvent


# This function processes direct Cloud Storage events
# (e.g. google.cloud.storage.object.v1.finalized).
@functions_framework.cloud_event
def hello_gcs(cloud_event: CloudEvent) -> None:
    """This function is triggered by a change in a storage bucket.

    Args:
        cloud_event: The CloudEvent that triggered this function.
    Returns:
        None
    """
    # For direct storage events, the event data is the object metadata itself.
    bucket = cloud_event.data["bucket"]
    name = cloud_event.data["name"]
    print(f"gs://{bucket}/{name}")
The deployment command is as follows:

gcloud functions deploy cloud-function \
  --gen2 \
  --runtime=python39 \
  --timeout 60s \
  --project <your_project> \
  --region asia-northeast1 \
  --service-account cloud-function-sa@<your_project>.iam.gserviceaccount.com \
  --source=. \
  --entry-point=hello_gcs \
  --trigger-event-filters="type=google.cloud.storage.object.v1.finalized" \
  --trigger-event-filters="bucket=<your_project>-test-bucket"

As you can see in the command, the --trigger-event-filters flags specify which events trigger your function:

--trigger-event-filters="type=google.cloud.storage.object.v1.finalized" \
--trigger-event-filters="bucket=<your_project>-test-bucket"

We’ll cover the supported triggers in more detail in the next section.

It’s also recommended to add a .gcloudignore file to your root directory to exclude files from the upload. For more details, see https://cloud.google.com/sdk/gcloud/reference/topic/gcloudignore.

# Ignore everything by default
*

# Allow files explicitly
!foo.bar
!*.baz

# Explicitly allow the current dir. `gcloud functions deploy` fails without it.
!.
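Once deployed, you can test the trigger end to end by uploading an object and reading the function logs (the file name here is arbitrary):

# Upload a test object; this fires google.cloud.storage.object.v1.finalized
gcloud storage cp ./test.txt gs://<your_project>-test-bucket/test.txt

# Check the logs for the printed object path
gcloud functions logs read cloud-function --gen2 --region asia-northeast1 --project <your_project>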

Supported Triggers

https://cloud.google.com/functions/docs/calling

HTTP triggers

You can trigger your Cloud Functions using HTTP requests. One of the most useful use cases is webhooks.
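For comparison, a minimal HTTP-triggered function sketch looks like this (deployed with --trigger-http instead of event filters; hello_http is a hypothetical name):

import functions_framework


@functions_framework.http
def hello_http(request):
    # request is a Flask Request object
    name = request.args.get("name", "World")
    return f"Hello {name}!"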

Eventarc Triggers (Direct Event)

Eventarc supports several direct events from GCP resources; Pub/Sub triggers and Cloud Storage triggers are among them. For the full list of Eventarc events, please read Eventarc supported event types.

You can check the details of the Cloud Storage event types with the following command:

gcloud eventarc providers describe storage.googleapis.com --project <your_project> --location <location>

displayName: Cloud Storage
eventTypes:
- description: The live version of an object has become a noncurrent version.
  filteringAttributes:
  - attribute: bucket
    description: The bucket name being watched.
    required: true
  - attribute: type
    required: true
  type: google.cloud.storage.object.v1.archived
- description: An object has been permanently deleted.
  filteringAttributes:
  - attribute: bucket
    description: The bucket name being watched.
    required: true
  - attribute: type
    required: true
  type: google.cloud.storage.object.v1.deleted
- description: A new object is successfully created in the bucket.
  filteringAttributes:
  - attribute: bucket
    description: The bucket name being watched.
    required: true
  - attribute: type
    required: true
  type: google.cloud.storage.object.v1.finalized
- description: The metadata of an existing object changes.
  filteringAttributes:
  - attribute: bucket
    description: The bucket name being watched.
    required: true
  - attribute: type
    required: true
  type: google.cloud.storage.object.v1.metadataUpdated
name: projects/proj-everbrew/locations/us-west1/providers/storage.googleapis.com

You can specify the target event type when creating the Cloud Function with the following arguments:

--trigger-event-filters="type=google.cloud.storage.object.v1.finalized" \
--trigger-event-filters="bucket=<your_project>-test-bucket"

Note that you can’t filter by object prefix with this event type. If you need to filter events by object prefix, you can use the Audit log as an event source.

Eventarc Triggers (Audit Log)

For more flexible trigger filters, you can use the Audit Log event type.

https://cloud.google.com/eventarc/docs/reference/supported-events#cloud-storage_1

With the audit log type, you can filter on fine-grained events by service name and method name.

You can also check the details of audit log events with the command:

gcloud eventarc providers describe cloudaudit.googleapis.com --project <project> --location <location>

displayName: Cloud Audit Logs
eventTypes:
- description: An audit log is created that matches the trigger's filter criteria.
  filteringAttributes:
  - attribute: methodName
    description: The identifier of the service's operation.
    required: true
  - attribute: resourceName
    description: The complete path to a resource. Used to filter events for a specific
      resource.
    pathPatternSupported: true
  - attribute: serviceName
    description: The identifier of the Google Cloud service.
    required: true
  - attribute: type
    required: true
  type: google.cloud.audit.log.v1.written
name: projects/proj-everbrew/locations/us-west1/providers/cloudaudit.googleapis.com

If you use audit logs as the event source for Cloud Storage events, you can specify the filters via the following arguments, e.g. to trigger only on object create events for objects with the prefix someprefix-:

--trigger-event-filters type=google.cloud.audit.log.v1.written \
--trigger-event-filters serviceName=storage.googleapis.com \
--trigger-event-filters methodName=storage.objects.create \
--trigger-event-filters-path-pattern "resourceName=/projects/_/buckets/<your-bucket-name>/objects/someprefix-*"

With this configuration, the function is triggered only for objects that match the path pattern.
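Note that the CloudEvent payload for audit log events differs from direct storage events: the object path arrives inside the audit log payload under protoPayload.resourceName. A minimal handler sketch for this event type:

import functions_framework
from cloudevents.http import CloudEvent


# This function processes Audit Log events for storage.objects.create.
@functions_framework.cloud_event
def hello_gcs_audit(cloud_event: CloudEvent) -> None:
    # e.g. "projects/_/buckets/<your_project>-test-bucket/objects/someprefix-foo.txt"
    resource_name = cloud_event.data["protoPayload"]["resourceName"]
    print(resource_name)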

Tips: Check audit log events with Logs Explorer

If you use audit logs for your trigger events, you need to enable audit logs for the relevant service, in this case Cloud Storage (ref: Enable GCP audit logs with Terraform).
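As a sketch, enabling Data Access audit logs for Cloud Storage with Terraform might look like this (see the referenced post for details):

resource "google_project_iam_audit_config" "storage" {
  project = var.project
  service = "storage.googleapis.com"

  # storage.objects.create is a DATA_WRITE operation
  audit_log_config {
    log_type = "DATA_WRITE"
  }
}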

You can check the audit log events in Logs Explorer with the following query:

protoPayload.@type="type.googleapis.com/google.cloud.audit.AuditLog"
protoPayload.serviceName="storage.googleapis.com"

Summary

In this post, we learned how to set up Cloud Functions (2nd gen) with a Cloud Storage event trigger.

There are mainly two ways to configure trigger events:

  1. Eventarc Direct Event google.cloud.storage.object.v1.finalized
  2. Eventarc Audit Log Event google.cloud.audit.log.v1.written with serviceName and methodName

With the Audit Log Event, you can also filter events with a path pattern, which lets you configure a trigger that invokes the function only for objects matching a specific path pattern (e.g. a prefix).

Beyond Cloud Storage events, you can configure triggers for events from almost all GCP resources.

Cloud Functions Series

  1. Cloud Functions Series 2: Deploy Cloud Functions from GitHub Actions

Ref

  1. https://stackoverflow.com/questions/52976199/gcloudignore-doesnt-allow-standard-wildcard-whitelist
  2. Determine event filters for Cloud Audit Logs
  3. Eventarc Triggers
  4. Understanding Path Patterns
  5. https://github.com/cloudevents/sdk-python
  6. Eventarc Event Types


Masato Naka

An SRE engineer, mainly working on Kubernetes. CKA (Feb 2021). His interests include Cloud-Native application development and machine learning.