The BioMassters

1st place out of 976 participants with an Average RMSE of 27.6280 (2nd place: 27.6779).

Approach

The solution is based on a UNet model with a shared encoder and aggregation via attention. The inputs to the encoder are 15-band, 256x256 images from the joint Sentinel-1 and Sentinel-2 satellite missions. The encoder is shared across all 12 months, and its outputs are aggregated via self-attention. Finally, a decoder takes the aggregated features and predicts a single yearly AGBM map. We directly optimize RMSE using the AdamW optimizer and the CosineAnnealingLR scheduler. We don't compute the loss for high AGBM values (> 400). We use vertical flips, rotations, and random month dropout as augmentations; month dropout simply removes whole months of imagery from the input.
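The sketch below illustrates the overall idea in simplified PyTorch: one encoder shared across all monthly images, self-attention over the per-month features, and a decoder producing a single yearly AGBM map. It is not the competition code; the real model uses a full UNet encoder/decoder with skip connections, and all module names and sizes here are illustrative placeholders.

```python
# Simplified sketch: shared per-month encoder + self-attention aggregation + decoder.
import torch
import torch.nn as nn


class SharedEncoderAttentionModel(nn.Module):
    def __init__(self, in_channels: int = 15, embed_dim: int = 64, num_heads: int = 4):
        super().__init__()
        # Shared encoder applied independently to every monthly image.
        self.encoder = nn.Sequential(
            nn.Conv2d(in_channels, embed_dim, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(embed_dim, embed_dim, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
        )
        # Self-attention over the monthly feature maps (per spatial location).
        self.attention = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)
        # Decoder predicting a single yearly AGBM map.
        self.decoder = nn.Sequential(
            nn.Conv2d(embed_dim, embed_dim, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(embed_dim, 1, kernel_size=1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, months, channels, height, width), e.g. (B, 12, 15, 256, 256)
        b, t, c, h, w = x.shape
        feats = self.encoder(x.view(b * t, c, h, w))            # (B*T, D, H, W)
        d = feats.shape[1]
        feats = feats.view(b, t, d, h, w)

        # Treat each pixel as a sequence of T monthly tokens and attend over months.
        tokens = feats.permute(0, 3, 4, 1, 2).reshape(b * h * w, t, d)
        attended, _ = self.attention(tokens, tokens, tokens)     # (B*H*W, T, D)
        aggregated = attended.mean(dim=1)                        # pool over months
        aggregated = aggregated.view(b, h, w, d).permute(0, 3, 1, 2)

        return self.decoder(aggregated)                          # (B, 1, H, W)


if __name__ == "__main__":
    model = SharedEncoderAttentionModel()
    dummy = torch.randn(2, 12, 15, 64, 64)   # small spatial size for a quick check
    print(model(dummy).shape)                 # torch.Size([2, 1, 64, 64])
```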

Highlights

Prerequisites & Hardware

Setup

Create an environment using Python 3.8 (the solution was originally run on Python 3.8.10), then install the required Python packages:

pip install -r requirements.txt

Download the data from the competition page and unzip it into the data folder.

Training

To run training from the command line:

sh ./run.sh

Training takes about 8 days on two A100 40GB GPUs.
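As a hedged illustration of the objective described in the Approach section, the snippet below shows a masked RMSE loss (ignoring targets above 400) combined with AdamW and CosineAnnealingLR. The placeholder model, learning rate, and epoch count are purely illustrative assumptions, not the actual training configuration.

```python
# Sketch: RMSE optimized directly, skipping pixels with very high AGBM targets.
import torch
import torch.nn as nn


def masked_rmse(pred: torch.Tensor, target: torch.Tensor, max_agbm: float = 400.0) -> torch.Tensor:
    """RMSE computed only over pixels whose target AGBM does not exceed max_agbm."""
    mask = target <= max_agbm
    return torch.sqrt(((pred - target)[mask] ** 2).mean())


if __name__ == "__main__":
    # Placeholder model and settings purely for illustration.
    model = nn.Conv2d(15, 1, kernel_size=1)
    num_epochs = 10
    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
    scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=num_epochs)

    x = torch.randn(2, 15, 64, 64)            # dummy single-month input
    target = torch.rand(2, 1, 64, 64) * 500   # dummy AGBM targets, some above 400

    for epoch in range(num_epochs):
        optimizer.zero_grad()
        loss = masked_rmse(model(x), target)
        loss.backward()
        optimizer.step()
        scheduler.step()
```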

Inference

Download the pretrained models and extract them into the models folder.

unzip models.zip -d models

To run inference from the command line:

sh ./submit.sh

Inference takes about 5 minutes on a single A100 40GB GPU (note that on a V100 32GB the results are slightly different).