Google Cloud Dataflow Template Pipelines

These Dataflow templates are an effort to solve simple, but large, in-Cloud data tasks, including data import/export/backup/restore and bulk API operations, without a development environment. The technology under the hood which makes these operations possible is the Google Cloud Dataflow service combined with a set of Apache Beam SDK templated pipelines.

Google provides this collection of pre-implemented Dataflow templates as a reference and as a starting point for easy customization by developers wanting to extend their functionality.

Note on Default Branch

As of November 18, 2021, our default branch is named "main". This does not affect forks. If you would like your fork and its local clone to reflect these changes, you can follow GitHub's branch renaming guide.

Template Pipelines

For documentation on each template's usage and parameters, please see the official docs.

Using UDFs

User-defined functions (UDFs) allow you to customize a template's functionality by providing a short JavaScript function, without having to maintain the entire codebase. This is useful in situations in which you'd like to rename fields, filter values, or even transform data formats before output to the destination. All UDFs are executed by providing the payload of the element as a string to the JavaScript function. You can then use JavaScript's built-in JSON parser or other system functions to transform the data prior to the pipeline's output. The return statement of a UDF specifies the payload to pass forward in the pipeline and should always return a string value. If no value is returned or the function returns undefined, the incoming record will be filtered out of the output.

UDF Function Specification

| Template | UDF Input Type | Input Description | UDF Output Type | Output Description |
| --- | --- | --- | --- | --- |
| Datastore Bulk Delete | String | A JSON string of the entity | String | A JSON string of the entity to delete; filter entities by returning undefined |
| Datastore to Pub/Sub | String | A JSON string of the entity | String | The payload to publish to Pub/Sub |
| Datastore to GCS Text | String | A JSON string of the entity | String | A single line within the output file |
| GCS Text to BigQuery | String | A single line within the input file | String | A JSON string which matches the destination table's schema |
| Pub/Sub to BigQuery | String | A string representation of the incoming payload | String | A JSON string which matches the destination table's schema |
| Pub/Sub to Datastore | String | A string representation of the incoming payload | String | A JSON string of the entity to write to Datastore |
| Pub/Sub to Splunk | String | A string representation of the incoming payload | String | The event data to be sent to the Splunk HEC events endpoint; must be a string or a stringified JSON object |

UDF Examples

For a comprehensive list of samples, please check our udf-samples folder.

Adding fields

/**
 * A transform which adds a field to the incoming data.
 * @param {string} inJson
 * @return {string} outJson
 */
function transform(inJson) {
  var obj = JSON.parse(inJson);
  obj.dataFeed = "Real-time Transactions";
  obj.dataSource = "POS";
  return JSON.stringify(obj);
}

Filtering records

/**
 * A transform function which only accepts 42 as the answer to life.
 * @param {string} inJson
 * @return {string} outJson
 */
function transform(inJson) {
  var obj = JSON.parse(inJson);
  // only output objects which have an answer to life of 42.
  if (obj.hasOwnProperty('answerToLife') && obj.answerToLife === 42) {
    return JSON.stringify(obj);
  }
}
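
Renaming fields

As a further sketch of the field renaming mentioned above (illustrative only; the sourceIp and clientIp field names are hypothetical and not part of any template's schema):

/**
 * A transform which renames a field in the incoming data.
 * Note: sourceIp and clientIp are placeholder field names for illustration.
 * @param {string} inJson
 * @return {string} outJson
 */
function transform(inJson) {
  var obj = JSON.parse(inJson);
  // Copy the value to the new field name, then drop the old field.
  if (obj.hasOwnProperty('sourceIp')) {
    obj.clientIp = obj.sourceIp;
    delete obj.sourceIp;
  }
  return JSON.stringify(obj);
}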

Contributing

To contribute to the repository, see CONTRIBUTING.md.

Release Process

Templates are released on a weekly basis (best effort) as part of the effort to keep Google-provided templates updated with the latest fixes and improvements.

To learn more about this process, or how you can stage your own changes, see Release Process.

More Information