Cloud Functions Series3: Run Cloud Functions Locally (Python)

Masato Naka
4 min read · Jun 29, 2024


Introduction

In this series on Cloud Functions development, we’ve covered specific use cases and deployment methods:

  1. Cloud Functions Series1: Trigger Cloud Functions by Cloud Storage upload events (2nd gen)
  2. Cloud Functions Series2: Deploy Cloud Functions from GitHub Actions

When developing your own Cloud Functions, a local development environment is crucial for quick iteration. Without a proper way to execute Cloud Functions locally, it would take several minutes to deploy and test each change in the real environment. In this post, I’ll share how to test your Cloud Functions in a local environment, using Python.

Example app

In this post, we’ll use the example app from Cloud Functions Series1: Trigger Cloud Functions by Cloud Storage upload events (2nd gen). This app receives a CloudEvent from Cloud Storage.

import functions_framework
from cloudevents.http import CloudEvent


# This function processes Audit Log events for storage.objects.create
@functions_framework.cloud_event
def hello_gcs(cloud_event: CloudEvent) -> None:
    """This function is triggered by a change in a storage bucket.

    Args:
        cloud_event: The CloudEvent that triggered this function.

    Returns:
        None
    """
    resource_name = cloud_event.data["protoPayload"]["resourceName"]
    print(resource_name)

Functions Framework

Cloud Functions uses the open-source functions-framework to wrap your function in a persistent HTTP application. The Functions Framework can run on supported platforms including Cloud Functions, Cloud Run, and local environments.

Use the Functions Framework in a local environment

The first step is to add functions-framework to the dev group of your Poetry dependencies with the following command:

poetry add --group dev functions-framework

After completing the installation, you can start your Cloud Functions script with the following command:

poetry run functions-framework --target hello_gcs --source script_cloud_functions.py --signature-type cloudevent

In this example, the target function hello_gcs in script_cloud_functions.py waits for requests carrying a CloudEvent, listening on port 8080 by default.

If you encounter an error related to multithreading on macOS, you can set the following environment variable to resolve it (ref: https://github.com/GoogleCloudPlatform/functions-framework-python/issues/97):

export OBJC_DISABLE_INITIALIZE_FORK_SAFETY=YES

Now you can send a request with the curl command:

curl localhost:8080 \
-X POST \
-H "Content-Type: application/json" \
-H "ce-id: 123451234512345" \
-H "ce-specversion: 1.0" \
-H "ce-time: 2020-01-02T12:34:56.789Z" \
-H "ce-type: google.cloud.audit.log.v1.written" \
-H "ce-source: //cloudaudit.googleapis.com/projects/PROJECT_ID/logs/data_access" \
-d '{
"protoPayload": {
"resourceName": "projects/_/buckets/test-bucket/objects/object_path.txt",
"serviceName": "storage.googleapis.com",
"methodName": "storage.objects.create"
}
}'

You’ll see OK as the result of the curl command, and the following log output from functions-framework:

projects/_/buckets/test-bucket/objects/object_path.txt

You can adjust the headers and data according to your function’s trigger. In the example app above, you can get the resourceName from the protoPayload with the following code:

resource_name = cloud_event.data["protoPayload"]["resourceName"]
print(resource_name)

Testing Different Event Types

In the above example, we tested a CloudEvent generated from a Cloud Storage Audit Log. However, you might need to test events from other sources, such as Pub/Sub or directly from Cloud Storage. For these scenarios, you’ll need to adjust the curl command headers and data payload accordingly.

Refer to the following resources for constructing appropriate curl commands for different event types:

Pub/Sub Events:

curl localhost:8080 \
-X POST \
-H "Content-Type: application/json" \
-H "ce-id: 123451234512345" \
-H "ce-specversion: 1.0" \
-H "ce-time: 2020-01-02T12:34:56.789Z" \
-H "ce-type: google.cloud.pubsub.topic.v1.messagePublished" \
-H "ce-source: //pubsub.googleapis.com/projects/MY-PROJECT/topics/MY-TOPIC" \
-d '{
"message": {
"data": "d29ybGQ=",
"attributes": {
"attr1":"attr1-value"
}
},
"subscription": "projects/MY-PROJECT/subscriptions/MY-SUB"
}'

Cloud Storage Events:

curl localhost:8080 \
-X POST \
-H "Content-Type: application/json" \
-H "ce-id: 123451234512345" \
-H "ce-specversion: 1.0" \
-H "ce-time: 2020-01-02T12:34:56.789Z" \
-H "ce-type: google.cloud.storage.object.v1.finalized" \
-H "ce-source: //storage.googleapis.com/projects/_/buckets/MY-BUCKET-NAME" \
-H "ce-subject: objects/MY_FILE.txt" \
-d '{
"bucket": "MY_BUCKET",
"contentType": "text/plain",
"kind": "storage#object",
"md5Hash": "...",
"metageneration": "1",
"name": "MY_FILE.txt",
"size": "352",
"storageClass": "MULTI_REGIONAL",
"timeCreated": "2020-04-23T07:38:57.230Z",
"timeStorageClassUpdated": "2020-04-23T07:38:57.230Z",
"updated": "2020-04-23T07:38:57.230Z"
}'

For more details about each event type, you can refer to https://github.com/googleapis/google-cloudevents.

Specifically for the Pub/Sub event, you can find the message and subscription properties in the CloudEvents definition.

These resources provide detailed information on the required headers and payload structures for each event type, ensuring accurate local testing.

Summary

Developing Cloud Functions locally using the Functions Framework offers several benefits, including faster development cycles and more efficient debugging. By testing functions locally, you avoid the delays associated with cloud deployments and ensure a consistent, reproducible environment. This approach increases confidence in your code’s reliability and allows for flexible, thorough testing. Overall, local development with the Functions Framework accelerates your workflow and improves code quality, making it an essential practice for efficient Cloud Functions development.


Masato Naka

An SRE, mainly working on Kubernetes. CKA (Feb 2021). His interests include Cloud-Native application development and machine learning.