Google Cloud Python Logging Library Release Improves Serverless Support



Google has announced version 3.0.0 of their Google Cloud Python logging library. The release brings a number of new features including enhanced support for Cloud Run and Cloud Functions, support for string JSON payloads, and automated metadata attachments.

With v3.0.0, the library now makes use of GCP's structured JSON logging functionality in supported environments such as GKE, Cloud Run, and Cloud Functions. The library auto-detects whether it is running in a supported environment and, if so, uses the StructuredLogHandler. This handler writes logs as JSON strings to standard output, which GCP's built-in agents then parse and deliver to Cloud Logging. It is still possible to log via the previous method in serverless environments by manually configuring the library with a CloudLoggingHandler instance.
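Conceptually, the structured approach boils down to emitting one JSON object per line on standard output, which the environment's logging agent then parses. The following minimal, library-free sketch only approximates what StructuredLogHandler produces; the handler name and exact field layout here are illustrative:

import json
import logging
import sys

class JsonStdoutHandler(logging.Handler):
    """Toy stand-in for StructuredLogHandler: one JSON object per line on stdout."""

    def emit(self, record):
        entry = {
            "message": record.getMessage(),
            "severity": record.levelname,
            "logging.googleapis.com/sourceLocation": {
                "file": record.pathname,
                "line": record.lineno,
                "function": record.funcName,
            },
        }
        sys.stdout.write(json.dumps(entry) + "\n")

logger = logging.getLogger("structured-demo")
logger.addHandler(JsonStdoutHandler())
logger.warning("hello world")

Because the agent owns delivery, the process itself never holds undelivered log batches, which is what makes this approach a better fit for short-lived serverless containers.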

This corrects a user-reported issue in which logs were dropped in serverless environments such as Cloud Run and Cloud Functions. The previous approach batched logs before sending them over the network; if the underlying serverless environment was spun down before a batch was sent, those logs were lost.
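The failure mode can be sketched with a toy buffer-then-send transport (the class and its fields are illustrative, not the library's implementation):

class BatchTransport:
    """Illustrative batching transport: buffers entries, sends on flush."""

    def __init__(self, batch_size=100):
        self.batch_size = batch_size
        self.buffer = []
        self.sent = []  # stands in for entries delivered over the network

    def log(self, entry):
        self.buffer.append(entry)
        if len(self.buffer) >= self.batch_size:
            self.flush()

    def flush(self):
        self.sent.extend(self.buffer)
        self.buffer.clear()

transport = BatchTransport(batch_size=100)
for i in range(150):
    transport.log(f"entry {i}")

# Only the first full batch has been delivered; if the container is frozen
# or killed at this point, the 50 still-buffered entries are lost.
print(len(transport.sent), len(transport.buffer))  # prints: 100 50

Writing directly to standard output, as StructuredLogHandler does, sidesteps this because each entry leaves the process as soon as it is logged.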

The logging library now also supports automatically detecting and attaching metadata about the environment to each log message. Currently supported fields include the GCP resource the log originated from, information about the HTTP request in the log's context, and the source location (e.g. file, line, and function name). While the library will attempt to set this data automatically, it is possible to explicitly set the fields when needed as follows:

import logging

logging.info("hello", extra={
    "labels": {"foo": "bar"},
    "http_request": {"requestUrl": "localhost"},
    "trace": "01234"
})

In previous iterations of the library, the Python standard library integration was only able to send logs with string payloads. This update adds the ability to log JSON payloads in two separate ways. The first method has the JSON data attached as a JSON-parsable string:

import logging
import json

data_dict = {"hello": "world"}
logging.info(json.dumps(data_dict))

In the second method, the JSON data is passed as a json_fields dictionary using Python's extra argument:

import logging

data_dict = {"hello": "world"}
logging.info("message field", extra={"json_fields": data_dict})

Additional features include a new Logger.log method that will attempt to infer and log any provided type. Arguments to the log function now also accept a larger variety of input formats, as shown in the code examples below:

# lowercase severity strings will be accepted
logger.log("hello world", severity="warning")

# a severity will be pulled out of the JSON payload if not otherwise set
logger.log({"hello": "world", "severity":"warning"})

# resource data can be passed as a dict instead of a Resource object
logger.log("hello world", resource={"type":"global", "labels":[]})

The team recommends using the standard Python logging interface for log creation. However, it is possible to use the library directly for use cases such as reading logs or managing log sinks.
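For the direct-usage path, a minimal sketch might look as follows. It assumes valid GCP credentials and a configured project, and the filter string, sink name, and bucket destination are all illustrative:

import google.cloud.logging

client = google.cloud.logging.Client()

# Read back recent log entries matching a filter.
for entry in client.list_entries(filter_="severity>=ERROR"):
    print(entry.timestamp, entry.payload)

# Manage log sinks: route matching logs to a destination such as a
# Cloud Storage bucket.
sink = client.sink(
    "error-sink",
    filter_="severity>=ERROR",
    destination="storage.googleapis.com/my-example-bucket",
)
if not sink.exists():
    sink.create()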

More details about the release can be found in the v3.0.0 migration guide and the google-cloud-logging user guide. The Google Cloud Python logging library is open-source and available on GitHub.
