Crop type mapping from optical and radar time series using attention-based deep learning

This code extends the PyTorch implementation of the PSE-TSA deep learning architecture to accommodate different forms of multi-sensor fusion.

Requirements

Satellite data preparation

Folder structure

The root folder should contain Sentinel-1 and Sentinel-2 directories named s1_data and s2_data. Their sub-directories must follow the structure in the figure below. <img src="img/folder_structure.PNG" alt="folder structure" width="500">

Crop type labels

Reference data (Registre parcellaire graphique (RPG)) is obtained from the French open data platform. A total of 20 agricultural land use classes are distributed within the study area, Finistère. The following steps are applied to derive analysis-ready crop type labels:

In the end, 12 classes are retained, namely: maize, wheat, barley, rapeseed, protein crops, gel (frozen surfaces), fodder, pasture and moor, meadows, orchards, vegetables/flowers, and other cereals.

Their corresponding labels are provided as a list of sub-classes in single_sensor/train.py and multi_sensor/train_fusion.py to be considered for classification.
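For illustration, the 12 retained classes could be mapped to integer labels as below. This mapping is hypothetical; the actual sub-class lists live in the training scripts named above:

```python
# Hypothetical mapping of the 12 retained classes to integer labels;
# the authoritative sub-class lists are defined in
# single_sensor/train.py and multi_sensor/train_fusion.py.
CLASSES = [
    "maize", "wheat", "barley", "rapeseed", "protein crops",
    "gel (frozen surfaces)", "fodder", "pasture and moor",
    "meadows", "orchards", "vegetables/flowers", "other cereals",
]
CLASS_TO_LABEL = {name: i for i, name in enumerate(CLASSES)}
```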

Running main experiments


```bash
# single sensor (Sentinel-1)
python train.py --dataset_folder /s1_data/Chateaulin --dataset_folder2 /s1_data/Quimper --val_folder /s1_data/Morlaix --test_folder /s1_data/Brest --epochs 100 --rdm_seed 1 --sensor S1 --input_dim 2 --mlp1 [2,32,64] --num_classes 12 --res_dir /output_dir

# single sensor (Sentinel-2)
python train.py --dataset_folder /s2_data/Chateaulin --dataset_folder2 /s2_data/Quimper --val_folder /s2_data/Morlaix --test_folder /s2_data/Brest --epochs 100 --rdm_seed 1 --sensor S2 --input_dim 10 --mlp1 [10,32,64] --num_classes 12 --minimum_sampling 27 --res_dir /output_dir

# multi-sensor (early fusion)
python train_fusion.py --dataset_folder /s1_data/Chateaulin --dataset_folder2 /s1_data/Quimper --val_folder /s1_data/Morlaix --test_folder /s1_data/Brest --fusion_type early --minimum_sampling 27 --interpolate_method nn --epochs 100 --rdm_seed 1 --input_dim 2 --mlp1 [2,32,64] --num_classes 12 --res_dir /output_dir
```

"""
for multi-sensor, Sentinel-1 data directory (s1_data) is modified as (s2_data) in the dataset.py script to load Sentinel-2 data. Additionally, input_dim and mlp1-4 are handled within multi_sensor/models/stclassifier_fusion.py
"""

Types of fusion

<img src="img/fusion.gif" alt="fusion diagram" width="500">

Results

Quantitative results from single- and multi-sensor experiments are available in the results folder.

Credits

Reference

Please cite the following paper if you use any part of the code:

@article{oforiampofo2021crop,
  author  = {Ofori-Ampofo, Stella and Pelletier, Charlotte and Lang, Stefan},
  title   = {Crop Type Mapping from Optical and Radar Time Series Using Attention-Based Deep Learning},
  journal = {Remote Sensing},
  year    = {2021},
  month   = {11},
  volume  = {13},
  pages   = {4668},
  doi     = {10.3390/rs13224668}
}

Contributors