
Update: This repository is no longer maintained.

Security Response Automation

Take automated actions on your Security Command Center findings.

You're in control.

Architecture

  1. A finding is generated by either Security Command Center or Cloud Logging (legacy) and sent to a Pub/Sub topic.
  2. The Filter Cloud Function can optionally run the finding through a series of Rego policies that automatically mark the finding as a false positive and auto-close it.
  3. If the finding is valid for your environment, it is sent to the Router Function, which is configured by YAML to forward the finding to the correct auto-remediation function that you have enabled.
  4. The auto-remediation Cloud Functions then take action to remediate the issue identified by the finding.
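The flow above can be sketched as follows. This is a simplified Python model of the pipeline; all function and field names here are illustrative, not the project's actual Go implementation:

```python
# Simplified model of the SRA pipeline: filter -> router -> remediation.
# Names are illustrative; the real project implements this in Go.

def is_false_positive(finding, rego_rules):
    """Step 2: a finding is auto-closed if any filter rule matches."""
    return any(rule(finding) for rule in rego_rules)

def route(finding, routing_table):
    """Step 3: look up the automations enabled for this finding type."""
    return routing_table.get(finding["category"], [])

def handle(finding, rego_rules, routing_table, automations):
    """Step 4: run every remediation configured for the finding."""
    if is_false_positive(finding, rego_rules):
        return "closed as false positive"
    actions = route(finding, routing_table)
    for name in actions:
        automations[name](finding)
    return f"ran {len(actions)} automation(s)"

# Example wiring: an anomalous IAM grant routed to iam_revoke.
finding = {"category": "anomalous_iam", "grantee": "eve@example.com"}
routing = {"anomalous_iam": ["iam_revoke"]}
log = []
result = handle(finding, [], routing,
                {"iam_revoke": lambda f: log.append(f["grantee"])})
```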

Automations

| Function Name | Service | Description |
| --- | --- | --- |
| CloseBucket | GCS | Removes public access from a GCS bucket |
| CloseCloudSQL | Cloud SQL | Removes public access from a Cloud SQL instance |
| ClosePublicDataset | BigQuery | Removes public access from a BigQuery dataset |
| CloudSQLRequireSSL | Cloud SQL | Configures a Cloud SQL instance to require encryption in transit |
| DisableDashboard | Google Kubernetes Engine | Disables the GKE dashboard |
| EnableAuditLogs | IAM | Enables Data Access logs |
| EnableBucketOnlyPolicy | IAM | Enables Uniform Bucket Access on the bucket in question |
| IAMRevoke | IAM | Revokes IAM permissions granted by an anomalous grant |
| OpenFirewall | Compute Engine | Closes a firewall rule that allows 0.0.0.0/0 ingress |
| RemovePublicIP | Compute Engine | Removes the external IP from a GCE instance |
| SnapshotDisk | Compute Engine | Creates a disk snapshot in response to a C2 finding |
| UpdatePassword | Cloud SQL | Updates the Cloud SQL root password |

Configuration

Filter

NOTE: Filters are only supported if using SCC Notifications

Sometimes a finding is a false positive because the behavior is expected in your environment. In that case, the Filter Cloud Function automatically marks the finding as a false positive in SCC and sets it to INACTIVE so you don't have to alert on it. Filters are written in Rego, the policy language from the good folks at Open Policy Agent that is also used in other Google Cloud open source projects.

To add your own Rego files, simply place them in ./config/filters. The Cloud Function picks up any file with the .rego extension except *_test.rego, so please also add tests. Each file must contain a single rule that evaluates to true if the finding should be filtered. For example, suppose that in a particular low-risk project we want to filter out Bad IP findings that look like valid NTP requests, since they often are. The Rego would look like this:

```rego
# filename: ntpd.rego

package sra.filter

ntpd {
	ipcon := input.finding.sourceProperties.properties.ipConnection
	ipcon.destPort == 123
	ipcon.protocol == 17
}
```

A few notes on the syntax:

  1. The package name must be sra.filter and the rule name must match the filename (without the extension), since the query the Cloud Function uses is data.sra.filter.<filename-without-extension>.
  2. Rego's entry point is called "input", and the finding JSON has a root node of "finding", so we address it as input.finding followed by the nested fields beneath it.
  3. You can think of each line in a rule body as a boolean condition, all AND'ed together. So in the example above we assert that destPort must equal 123 and protocol must equal 17 (UDP) for the finding to be filtered out. For more on Rego syntax see: https://godoc.org/github.com/open-policy-agent/opa/rego
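The AND semantics in note 3 can be illustrated outside Rego. A Python equivalent of the ntpd rule (the field names mirror the Rego example; the JSON structure here is a minimal assumed shape, not a full SCC notification) treats each line of the rule body as one conjunct:

```python
# Python equivalent of the ntpd Rego rule: every line in the rule body
# must hold (logical AND) for the finding to be filtered.

def ntpd(input_doc):
    ipcon = input_doc["finding"]["sourceProperties"]["properties"]["ipConnection"]
    return ipcon["destPort"] == 123 and ipcon["protocol"] == 17

finding = {"finding": {"sourceProperties": {"properties": {
    "ipConnection": {"destPort": 123, "protocol": 17}}}}}
assert ntpd(finding)       # looks like valid NTP traffic -> filtered

finding["finding"]["sourceProperties"]["properties"]["ipConnection"]["destPort"] = 53
assert not ntpd(finding)   # one conjunct fails (DNS port) -> not filtered
```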

Testing Filters

OPA gives you the ability to test your Rego policies against actual JSON. To do this, add the notification JSON structure to the test and make assertions against it. We give an example of this in ./config/filters/false_positive_test.rego. Once you have downloaded OPA, you can run the tests yourself:

```shell
cp config/filters/false_positive.rego.sample config/filters/false_positive.rego
cp config/filters/false_positive_test.rego.sample config/filters/false_positive_test.rego
opa test config/filters
```
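For the ntpd.rego example above, a matching test could look like the following sketch. The input document here only mirrors the fields the rule reads; adjust it to your real notification JSON:

```rego
# filename: ntpd_test.rego

package sra.filter

test_ntpd_filters_valid_ntp {
	ntpd with input as {"finding": {"sourceProperties": {"properties": {"ipConnection": {"destPort": 123, "protocol": 17}}}}}
}
```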

Router

Before installation we'll configure our automations: copy ./config/sra.yaml.sample to ./config/sra.yaml. You can also view a mostly filled-out sample configuration file. Within this file we'll define a few steps to get started.

Every automation has a configuration similar to the following example:

```yaml
apiVersion: security-response-automation.cloud.google.com/v1alpha1
kind: Remediation
metadata:
  name: router
spec:
  parameters:
    etd:
      anomalous_iam:
        - action: iam_revoke
          target:
            - organizations/1234567891011/folders/424242424242/*
            - organizations/1234567891011/projects/applied-project
          excludes:
            - organizations/1234567891011/folders/424242424242/projects/non-applied-project
            - organizations/1234567891011/folders/424242424242/folders/565656565656/*
          properties:
            dry_run: true
            anomalous_iam:
              allow_domains:
                - foo.com
```

The first parameter represents the finding provider, sha (Security Health Analytics) or etd (Event Threat Detection).

Each provider lists findings, and each finding contains a list of automations to be applied to it. In this example we apply the iam_revoke automation to Event Threat Detection's Anomalous IAM Grant finding. For a full list of automations and their supported findings see automations.md.

The target and excludes arrays accept ancestry patterns that are compared against the incoming project. Both are considered, but excludes takes precedence. The ancestry pattern lets you specify granularity at the organization, folder, and project level.

| Pattern | Description |
| --- | --- |
| `organizations/123` | All projects under the organization 123 |
| `organizations/123/folders/456/*` | Any project in folder 456 in organization 123 |
| `organizations/123/folders/456/projects/789` | The project 789 in folder 456 in organization 123 |
| `organizations/123/projects/789` | The project 789 in organization 123 that is not within a folder |
| `organizations/123/*/projects/789` | The project 789 in organization 123, whether or not it is in a folder |
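The matching logic can be sketched as follows. This is a simplified Python model of the patterns and the excludes-over-target precedence described above, not the project's actual Go implementation:

```python
# Simplified model of ancestry-pattern matching. `resource` is a full
# ancestry path such as "organizations/123/folders/456/projects/789".

def matches(pattern, resource):
    if pattern.endswith("/*"):                  # org/folder subtree wildcard
        return resource.startswith(pattern[:-1])
    if "/*/" in pattern:                        # any (or no) folders in between
        head, tail = pattern.split("/*/", 1)
        return resource == f"{head}/{tail}" or (
            resource.startswith(head + "/") and resource.endswith("/" + tail))
    if "/projects/" not in pattern:             # bare organization: whole org
        return resource.startswith(pattern + "/")
    return resource == pattern                  # fully qualified project

def applies(targets, excludes, resource):
    """Excludes take precedence over targets."""
    if any(matches(p, resource) for p in excludes):
        return False
    return any(matches(p, resource) for p in targets)

# Mirrors the YAML example: target a folder, exclude one project in it.
targets = ["organizations/123/folders/456/*"]
excludes = ["organizations/123/folders/456/projects/789"]
assert not applies(targets, excludes, "organizations/123/folders/456/projects/789")
assert applies(targets, excludes, "organizations/123/folders/456/projects/790")
```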

All automations have a dry_run property that allows you to see which actions would have been taken without executing them. Starting with dry_run set to true is recommended to confirm the actions are as expected. Once you have confirmed this by viewing the logs in Cloud Logging, change the property to false and redeploy the automations.

The allow_domains property is specific to the iam_revoke automation. To see examples of how to configure the other automations, see the full documentation.

Configuring permissions

The service account is configured separately within main.tf. Here we inform Terraform which folders we're enforcing so that the required roles are automatically granted. You have a few choices for how to configure this step.

Installation

Following these instructions will deploy all automations. Before you get started, make sure you have the gcloud CLI and Terraform installed, then run:

```shell
gcloud auth login --update-adc
terraform init
terraform apply
```

If you don't want to install all automations, you can install certain automations individually by running terraform apply --target module.revoke_iam_grants. The module name for each automation is found in main.tf. Note that module.filter and module.router must always be installed.

TIP: Instead of entering variables every time, you can create a terraform.tfvars file and put key-value pairs there, e.g. automation-project="aerial-jigsaw-235219".
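For example, a minimal terraform.tfvars might look like this (the values are illustrative; the variable names come from the Terraform Inputs table below):

```hcl
automation-project = "aerial-jigsaw-235219"
organization-id    = "1234567891011"
folder-ids         = ["424242424242"]
```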

If at any point you want to revert the changes we've made, just run terraform destroy.

Reinstalling a Cloud Function

Terraform will create or destroy everything by default. To redeploy a single Cloud Function you can do:

```shell
terraform apply --target module.revoke_iam_grants
```

Terraform Inputs

| Name | Description | Type | Default | Required |
| --- | --- | --- | --- | --- |
| automation-project | Project ID where the Cloud Functions should be installed. | string | n/a | yes |
| enable-scc-notification | If true, create the notification config from SCC instead of Cloud Logging. | bool | true | no |
| findings-project | (Unused if enable-scc-notification is true) Project ID where Event Threat Detection security findings are sent by Security Command Center. Configured in the Google Cloud Console under Security > Threat Detection. | string | "" | no |
| folder-ids | Folder IDs on which to grant permission. | list(string) | n/a | yes |
| organization-id | Organization ID. | string | n/a | yes |

Logging

Each Cloud Function logs its actions to Cloud Logging. To view them, visit Cloud Logging, click the arrow on the right-hand side, and choose 'Convert to advanced filter'. Then paste in the appropriate filter below, making sure to change the project ID to the project where your Cloud Functions are installed.

| Function | Filter |
| --- | --- |
| Filter | `resource.type = "cloud_function" AND resource.labels.function_name = "Filter"` |
| Router | `resource.type = "cloud_function" AND resource.labels.function_name = "Router"` |
| CloseBucket | `resource.type = "cloud_function" AND resource.labels.function_name = "CloseBucket"` |
| CloseCloudSQL | `resource.type = "cloud_function" AND resource.labels.function_name = "CloseCloudSQL"` |
| ClosePublicDataset | `resource.type = "cloud_function" AND resource.labels.function_name = "ClosePublicDataset"` |
| CloudSQLRequireSSL | `resource.type = "cloud_function" AND resource.labels.function_name = "CloudSQLRequireSSL"` |
| DisableDashboard | `resource.type = "cloud_function" AND resource.labels.function_name = "DisableDashboard"` |
| EnableAuditLogs | `resource.type = "cloud_function" AND resource.labels.function_name = "EnableAuditLogs"` |
| EnableBucketOnlyPolicy | `resource.type = "cloud_function" AND resource.labels.function_name = "EnableBucketOnlyPolicy"` |
| IAMRevoke | `resource.type = "cloud_function" AND resource.labels.function_name = "IAMRevoke"` |
| OpenFirewall | `resource.type = "cloud_function" AND resource.labels.function_name = "OpenFirewall"` |
| RemovePublicIP | `resource.type = "cloud_function" AND resource.labels.function_name = "RemovePublicIP"` |
| SnapshotDisk | `resource.type = "cloud_function" AND resource.labels.function_name = "SnapshotDisk"` |
| UpdatePassword | `resource.type = "cloud_function" AND resource.labels.function_name = "UpdatePassword"` |

Development

Tools

Make sure you have installed the following tools for development and test:

For additional tools needed for testing:

```shell
make tools
```

To run the same tests that are run in the Pull Request:

```shell
make test
```