<div> <h1> Planet Snowcover <a href="https://planet-snowcover.readthedocs.io/en/latest/?badge=latest"><img src="https://readthedocs.org/projects/planet-snowcover/badge/?version=latest"></a> <span><img align="right" src="https://s3-us-west-2.amazonaws.com/uw-s3-cdn/wp-content/uploads/sites/98/2014/09/07214451/W-Logo_Purple_Hex.png" height=35px></span> </h1>

Planet Snowcover is a project that pairs airborne lidar and Planet Labs satellite imagery with cutting-edge computer vision techniques to identify snow-covered area at unprecedented spatial and temporal resolutions.

<p><img align="right" src="https://centennial.agu.org/wp-content/uploads/2018/08/AGU100_logo_H-CMYK.png" style="float:right; padding: 5px" height=35>💡This work was <a href="https://agu.confex.com/agu/fm19/meetingapp.cgi/Paper/594007">presented</a> by Tony (@acannistra) at AGU 2019 in San Francisco. See the slides <a href="./artifacts/AGU19_Talk/Cannistra-AGU-Planet-Snow.pdf">here</a>. </p>

Researchers: Tony Cannistra<sup>1</sup>, Dr. David Shean<sup>2</sup>, and Dr. Nicoleta Cristea<sup>2</sup>

<img src="./artifacts/co-ex-1.png"> </div> <div><small>1: Department of Biology, University of Washington, Seattle, WA.<br>2: Department of Civil and Environmental Engineering, University of Washington, Seattle, WA</small></div>

This Repository

This repository serves as the canonical source for the software and infrastructure necessary to successfully build and deploy a machine-learning-based snow classifier using Planet Labs imagery and airborne lidar data.

Primary Components

The contents of this repository are divided into several main components, detailed below. This is the place to start if you're looking for something in particular.

| Folder | Description | Details |
| --- | --- | --- |
| `./pipeline` | Jupyter notebooks detailing the entire data processing, machine learning, and evaluation pipeline. | These notebooks detail every step in this workflow, from start to finish. |
| `./preprocess` | A set of Python CLI tools for preprocessing data assets. | These tools help to reproject and threshold the ASO raster files, create vector footprints of raster data, tile the imagery for training, and perform other related tasks. |
| `./model` | The implementation of the machine learning/computer vision techniques used by this project. | This work relies heavily on the robosat.pink repository, which we've forked and modified extensively. |
| `./sagemaker` | The infrastructure required to use Amazon SageMaker to manage our ML training jobs. | SageMaker requires considerable configuration, including a Docker container. We build this container from this directory, which contains a copy of the `./model` directory. |
| `./experiments` | Configuration files that describe experiments used to assess the performance of this ML-based snow cover method. | Our ML infrastructure uses "config files" to describe the inputs and other parameters needed to train the model effectively. We use these files to describe the experiments we perform using different sets of ASO data and imagery. |
| `./implementation-notes` | Technical descriptions of the implementation considerations that went into this project. | These are working documents, in raw Markdown format. |
| `./raster_utils` | Small utility functions for managing raster computations. | Not much to see here. |
| `./environment` | Raw Python environment configuration files. | ⚠️ These emerge from conda and change often. Use sparingly. We preserve our environment via Docker, which should be used instead (see the `./sagemaker` directory). |
| `./analysis` | Jupyter notebooks that describe analyses of our snow mask product. | ⚠️ These are a work in progress and change frequently. |
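To give a flavor of the kind of preprocessing the `./preprocess` tools perform, here's a minimal NumPy sketch of thresholding a lidar-derived snow depth raster into a binary snow mask. The 0.1 m threshold, nodata value, and 255 "ignore" label here are illustrative assumptions, not the exact behavior of our CLI tools:

```python
import numpy as np

def snow_mask(depth, threshold=0.1, nodata=-9999.0):
    """Threshold a snow-depth array (meters) into a binary snow mask.

    Pixels at or above `threshold` become 1 (snow), pixels below become
    0 (no snow), and nodata pixels become 255 so they can be ignored
    during training. All specific values here are assumptions.
    """
    depth = np.asarray(depth, dtype=float)
    mask = (depth >= threshold).astype(np.uint8)
    mask[depth == nodata] = 255  # flag missing data, don't treat as "no snow"
    return mask

# A tiny 2x3 "raster": varying depths in meters, one nodata pixel.
depth = np.array([[0.0, 0.25, -9999.0],
                  [0.05, 1.3, 0.1]])
print(snow_mask(depth).tolist())  # [[0, 1, 255], [0, 1, 1]]
```

In practice the real tools operate on georeferenced rasters (e.g. with rasterio) rather than bare arrays, but the core depth-to-mask step is this simple.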

Requirements

Basic Requirements

The goal of this work is to provide a toolkit that is relatively easy to deploy for someone with working knowledge of the following tools:

More specific requirements can be found in the Infrastructure Deployment section below.

Development Requirements

This free, open-source software builds on a number of other free, open-source packages. To understand the inner workings of this project, you'll need familiarity with the following:

To build and manage our infrastructure, we use Docker and Terraform.

Accounts and Data

<h4> Amazon Web Services <img align="right" src="https://d1.awsstatic.com/logos/aws-logo-lockups/poweredbyaws/PB_AWS_logo_RGB.61d334f1a1a427ea597afa54be359ca5a5aaad5f.png" style="float:right; padding: 5px" height=30> </h4>

This project relies on cloud infrastructure from Amazon Web Services (AWS). AWS isn't the only provider in this space, but it's the one we chose due to a combination of funding resources and familiarity. To run these tutorials and perform development tasks with this software, you'll need an AWS account. You can get one here.

<h4>Planet Labs <img align="right" src="https://upload.wikimedia.org/wikipedia/commons/thumb/f/f3/Planet_Labs_logo.svg/200px-Planet_Labs_logo.svg.png" style="float:right;" height=40> </h4>

To access the Planet Labs imagery used to train our computer vision models and assess their performance, we rely on a relationship with collaborator Dr. David Shean in UW Civil and Environmental Engineering, who has access to Planet Labs data through a NASA Terrestrial Hydrology Program award.

If you're interested in getting access to Planet Labs imagery for research, check out the Planet Education and Research Program.

<h4>NASA Earthdata <img align="right" src="./docs/nasa-logo.conv.png" style="float:right;" height=40> </h4>

Finally, to gain access to the NASA/JPL Airborne Snow Observatory lidar-derived snow depth information, you need an account with NASA Earthdata. Sign up here.

Infrastructure Deployment

To explore this work and the tutorials herein, you'll need to deploy some cloud infrastructure. This project uses Docker and Terraform to manage and deploy consistent, reliable cloud infrastructure.

For detailed instructions on this process, view the documentation.

To jump right to the guts of the deployment, here's our Dockerfile and Terraform Resource Definition.
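For orientation, a Terraform resource definition for this kind of deployment typically looks something like the sketch below. The region, AMI ID, instance type, and names here are placeholders, not the values from our actual Resource Definition:

```terraform
# Hypothetical sketch only; see the repository's Terraform Resource
# Definition for the real configuration.
provider "aws" {
  region = "us-west-2"
}

resource "aws_instance" "tutorial_host" {
  ami           = "ami-00000000000000000" # placeholder AMI ID
  instance_type = "t3.large"              # placeholder instance size

  tags = {
    Project = "planet-snowcover"
  }
}
```

Running `terraform init` and then `terraform apply` against a definition like this provisions the instance; `terraform destroy` tears it down when you're done, which helps keep AWS costs predictable.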

Tutorials

Through support from Earth Science Information Partners, we're happy to be able to provide thorough interactive tutorials for these tools and methods in the form of Jupyter notebooks. You can see these tutorials in the data pipeline folder ./pipeline.

Acknowledgements and Funding Sources

This work wouldn't be possible without the advice and support of Dr. Nicoleta Cristea, Dr. David Shean, Shashank Bhushan, and others.

We gratefully acknowledge financial support from the Earth Science Information Partners (ESIP) Lab, the NASA Terrestrial Hydrology Program, the Planet Labs Education and Research Program, and the National Science Foundation.

<!-- <img src="https://www.esipfed.org/wp-content/uploads/2016/12/ESIP-final-logo.png" height=65> <img src="http://www.bu.edu/cs/files/2016/09/NSF-Logo-1efvspb.png" height=70> <img src="https://upload.wikimedia.org/wikipedia/commons/thumb/e/e5/NASA_logo.svg/500px-NASA_logo.svg.png" height=70> <img src="https://upload.wikimedia.org/wikipedia/commons/thumb/f/f3/Planet_Labs_logo.svg/200px-Planet_Labs_logo.svg.png" height=70> --> <img src="/docs/funding-agency-logos.png" height="70px">

Original Proposal

To see the original research proposal for this project, now out of date, view it here.


<p align="center"> <img src="https://s3-us-west-2.amazonaws.com/uw-s3-cdn/wp-content/uploads/sites/98/2014/09/07214435/Signature_Stacked_Purple_Hex.png" height='60px'> </p>