<img align="right" src="docs/source/figures/eo-learn-logo.png" alt="" width="300"/>
# eo-learn
eo-learn makes extraction of valuable information from satellite imagery easy.
The availability of open Earth observation (EO) data through the Copernicus and Landsat programs represents an unprecedented resource for many EO applications, ranging from ocean, land use and land cover monitoring to disaster control, emergency services and humanitarian relief. Given the large amount of high-spatial-resolution data at high revisit frequency, techniques able to automatically extract complex patterns from such spatio-temporal data are needed.
eo-learn is a collection of open source Python packages that have been developed to seamlessly access and process spatio-temporal image sequences acquired by any satellite fleet in a timely and automatic manner. eo-learn is easy to use, its design is modular, and it encourages collaboration -- sharing and reusing specific tasks in typical EO value-extraction workflows, such as cloud masking, image co-registration, feature extraction, classification, etc. Everyone is free to use any of the available tasks and is encouraged to improve them, develop new ones and share them with the rest of the community.
eo-learn makes the extraction of valuable information from satellite imagery as easy as defining a sequence of operations to be performed on satellite imagery. The image below illustrates a processing chain that maps water in satellite imagery by thresholding the Normalised Difference Water Index in a user-specified region of interest.
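As a rough, hedged illustration of what a single step of such a chain can look like in code, the sketch below defines a custom task that computes and thresholds the NDWI. The `"BANDS"` feature name, the band indices and the zero threshold are assumptions made for this example; the input/output and visualization tasks of a full chain are omitted.

```python
# A minimal sketch of the water-mapping step, assuming an EOPatch that already
# holds a (time, height, width, bands) array under FeatureType.DATA, "BANDS".
import numpy as np
from eolearn.core import EOTask, FeatureType


class ThresholdNDWI(EOTask):
    """Compute NDWI = (green - nir) / (green + nir) and store a boolean water mask."""

    def __init__(self, green_idx=1, nir_idx=3, threshold=0.0):
        self.green_idx = green_idx  # assumed band positions; adjust to your data source
        self.nir_idx = nir_idx
        self.threshold = threshold

    def execute(self, eopatch):
        bands = eopatch[FeatureType.DATA]["BANDS"]
        green, nir = bands[..., self.green_idx], bands[..., self.nir_idx]
        ndwi = (green - nir) / (green + nir + 1e-8)  # small epsilon avoids division by zero
        eopatch[FeatureType.MASK]["WATER"] = (ndwi > self.threshold)[..., np.newaxis]
        return eopatch
```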
The eo-learn library acts as a bridge between the Earth observation/remote sensing field and the Python ecosystem for data science and machine learning. The library is written in Python and uses NumPy arrays to store and handle remote sensing data. Its aim is, on one hand, to lower the entry barrier to the field of remote sensing for non-experts and, on the other hand, to bring the state-of-the-art tools for computer vision, machine learning and deep learning that exist in the Python ecosystem to remote sensing experts.
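To make the "bridge" concrete, the short sketch below shows that a feature stored in an `EOPatch` is just a NumPy array and can therefore be handed directly to any Python data-science tool; the patch path and the `"NDVI"` feature name are hypothetical.

```python
# A small sketch of the NumPy bridge; the path and feature name are hypothetical.
import numpy as np
from eolearn.core import EOPatch, FeatureType

patch = EOPatch.load("path/to/eopatch")    # a previously saved EOPatch
ndvi = patch[FeatureType.DATA]["NDVI"]     # plain numpy.ndarray of shape (time, height, width, 1)
temporal_mean = np.nanmean(ndvi, axis=0)   # ready for any ML or plotting library
print(type(ndvi), ndvi.shape, temporal_mean.shape)
```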
## Package Overview
The eo-learn package is structured into several modules according to different functionalities. Some modules contain extensions under an `extra` subfolder. Those modules typically require additional package dependencies which don't get installed by default, since they are usually very specific to the task.
The modules are:

- `core` - The main module which implements basic building blocks (`EOPatch`, `EOTask` and `EOWorkflow`) and commonly used functionalities (a workflow sketch follows this list).
- `coregistration` - Tasks which deal with image co-registration.
- `features` - A collection of utilities for extracting data properties and feature manipulation.
- `geometry` - Geometry-related tasks used for transformation and conversion between vector and raster data.
- `io` - Input/output tasks that deal with obtaining data from Sentinel Hub services or saving and loading data locally.
- `mask` - Tasks used for masking of data and calculation of cloud/snow/other masks.
- `ml_tools` - Various tools that can be used before or after the machine learning process.
- `visualization` - Visualization tools for the core elements of eo-learn.
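As promised above, here is a minimal, hedged sketch of how the `core` building blocks tie together, assuming the eo-learn 1.x API: tasks (here the hypothetical `ThresholdNDWI` from the earlier sketch plus the library's `LoadTask`/`SaveTask`) are wrapped into nodes and chained into an `EOWorkflow`. Folder names are placeholders.

```python
# A minimal workflow sketch, assuming the eo-learn 1.x API and the ThresholdNDWI
# task sketched earlier; folder names are placeholders.
from eolearn.core import EOWorkflow, LoadTask, SaveTask, linearly_connect_tasks

load = LoadTask("input_eopatches")         # read an existing EOPatch from disk
threshold = ThresholdNDWI(threshold=0.0)   # the custom task from the sketch above
save = SaveTask("output_eopatches")        # write the processed EOPatch back to disk

nodes = linearly_connect_tasks(load, threshold, save)
workflow = EOWorkflow(nodes)

# Execute on a single patch; each node can be given keyword arguments for its task.
workflow.execute({
    nodes[0]: {"eopatch_folder": "patch_0"},
    nodes[-1]: {"eopatch_folder": "patch_0"},
})
```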
## Installation

### Requirements
The package requires Python version >=3.8.
### Linux

Before installing eo-learn on Linux it is recommended to install the following system libraries:
sudo apt-get install gcc libgdal-dev graphviz proj-bin libproj-dev libspatialindex-dev
### Mac OS

Before installing eo-learn on Mac OS it is recommended to install the following system libraries with Homebrew:
brew install graphviz gcc gdal cmake spatialindex proj
### Windows

Before installing eo-learn on Windows it is recommended to install the following packages from the Unofficial Windows wheels repository:

- `gdal`
- `rasterio`
- `shapely`
- `fiona`
### PyPI distribution

eo-learn is available on PyPI and can be installed with:
pip install eo-learn
For some modules there are extra dependencies available, related to specific tasks. These were kept separate in order to keep the eo-learn installation light. You can install these with, e.g.:
pip install "eo-learn[EXTRA]"
pip install "eo-learn[VISUALIZATION]"
The full list (including their descriptions) is available here:
- `RAY` for installing ray and its dependencies
- `ZARR` for installing the zarr functionality for chunked timestamp saving/loading
- `EXTRA` for installing interpolation- and clustering-specific dependencies, or for installing `s2cloudless` used in cloud masking
- `VISUALIZATION` for using plotting libraries and utilities
- `FULL` for installing all dependencies described so far
- `DOCS` for developers, dependencies for building documentation
- `DEV` for developers, dependencies for testing and code contribution
### Conda Forge distribution

The package requires a Python environment >=3.8.

Thanks to the maintainers of the conda-forge feedstock (@benhuff, @dcunn, @mwilson8, @oblute, @rluria14), eo-learn can be installed using conda-forge as follows:
conda config --add channels conda-forge
conda install eo-learn
### Run with Docker

A Docker image with the latest released version of eo-learn is available at Docker Hub. It provides a full installation of eo-learn together with a Jupyter notebook environment. You can pull and run it with:
docker pull sentinelhub/eolearn:latest
docker run -p 8888:8888 sentinelhub/eolearn:latest
An extended version of the `latest` image additionally contains all example notebooks and data to get you started with eo-learn. Run it with:
docker pull sentinelhub/eolearn:latest-examples
docker run -p 8888:8888 sentinelhub/eolearn:latest-examples
Both Docker images can also be built manually from the GitHub repository:
docker build -f docker/eolearn.dockerfile . --tag=sentinelhub/eolearn:latest
docker build -f docker/eolearn-examples.dockerfile . --tag=sentinelhub/eolearn:latest-examples
## Documentation
For more information on the package content, visit readthedocs.
## More Examples

Examples and introductions to the package can be found here. A larger collection of examples is available in the eo-learn-examples repository. While the examples there are not always up to date, they can be a great source of ideas.
In the past, eo-learn served as a collection of many useful tasks, originating from various contributors or projects. In order to keep eo-learn light and easy to maintain, we have decided to move these specific tasks to eo-learn-examples/extra-tasks.
## Contributions

All eo-learn contributors are listed in the credits file. If you would like to contribute to eo-learn, please check our contribution guidelines.
## Blog posts and papers
- Introducing eo-learn (by Devis Peressutti)
- Land Cover Classification with eo-learn: Part 1 - Mastering Satellite Image Data in an Open-Source Python Environment (by Matic Lubej)
- Land Cover Classification with eo-learn: Part 2 - Going from Data to Predictions in the Comfort of Your Laptop (by Matic Lubej)
- Land Cover Classification with eo-learn: Part 3 - Pushing Beyond the Point of “Good Enough” (by Matic Lubej)
- Innovations in satellite measurements for development
- Use eo-learn with AWS SageMaker (by Drew Bollinger)
- Spatio-Temporal Deep Learning: An Application to Land Cover Classification (by Anze Zupanc)
- Tree Cover Prediction with Deep Learning (by Daniel Moraite)
- NoRSC19 Workshop on eo-learn
- Tracking a rapidly changing planet (by Development Seed)
- Land Cover Monitoring System (by Jovan Visnjic and Matej Aleksandrov)
- eo-learn Webinar (by Anze Zupanc)
- Cloud Masks at Your Service
- ML examples for Common Agriculture Policy
  - High-Level Concept
  - Data Handling
  - Outlier detection
  - Identifying built-up areas
  - Similarity Score
  - Bare Soil Marker
  - Mowing Marker
  - Pixel-level Mowing Marker
  - Crop Type Marker
  - Homogeneity Marker
  - Parcel Boundary Detection
  - Land Cover Classification (still to come)
  - Minimum Agriculture Activity (still to come)
  - Combining the Markers into Decisions
  - The Challenge of Small Parcels
  - Traffic Light System
  - Expert Judgement Application
- Scale-up your eo-learn workflow using Batch Processing API (by Maxim Lamare)
## Questions and Issues

Feel free to ask questions about the package and its use cases at the Sentinel Hub forum or raise an issue on GitHub.

You are welcome to send your feedback to the package authors, the EO Research team, through any of the Sentinel Hub communication channels.
## License
See LICENSE.
## Acknowledgements
This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreements No. 776115, No. 101004112, No. 101059548 and No. 101086461.