TGS-Salt

53rd place (top 2%) solution for the Kaggle TGS Salt Identification Challenge

General

This is a solid solution that reaches the top 2% without any post-processing.
According to the forum, there are many useful tricks, such as Heng's binary empty vs. non-empty classifier and Peter's snapshot ensembling with a cyclic learning rate (+0.01 LB), among others.

My solution

Augmentation:

I padded the images from 101x101 to 128x128, but I did not compare this with simply resizing. Some said resize+flip works better than pad+aug. You can check the code in transform.py.
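
For illustration, here is a minimal sketch of reflect-padding a 101x101 tile to 128x128 and cropping a prediction back; the exact border mode and left/right split used in transform.py may differ.

```python
import cv2

def pad_to_128(img):
    # 128 - 101 = 27 extra pixels: 13 on the top/left, 14 on the bottom/right
    return cv2.copyMakeBorder(img, 13, 14, 13, 14, cv2.BORDER_REFLECT_101)

def crop_back_to_101(mask):
    # undo the padding on a predicted 128x128 mask
    return mask[13:13 + 101, 13:13 + 101]
```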

pretrained model

I used pretrained resnet34, se-resnext50, and se-resnext101 models as the UNet encoder. In my experiments, the pretrained se-resnext50 was the best UNet encoder, although some top Kagglers said their best model used resnet34.
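
As a rough sketch (not the repo's actual code), this is how an ImageNet-pretrained resnet34 can be split into UNet encoder stages; the se-resnext variants are wired up analogously.

```python
import torch.nn as nn
import torchvision

class ResNet34Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        base = torchvision.models.resnet34(pretrained=True)
        self.stem = nn.Sequential(base.conv1, base.bn1, base.relu)  # stride 2
        self.pool = base.maxpool                                    # stride 4
        self.enc1, self.enc2 = base.layer1, base.layer2             # strides 4, 8
        self.enc3, self.enc4 = base.layer3, base.layer4             # strides 16, 32

    def forward(self, x):
        x0 = self.stem(x)
        x1 = self.enc1(self.pool(x0))
        x2 = self.enc2(x1)
        x3 = self.enc3(x2)
        x4 = self.enc4(x3)
        return x0, x1, x2, x3, x4  # skip connections for the decoder
```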

scSE and hypercolumn

I used the scSE block and a hypercolumn in the decoder. They raise the score a little.
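
A minimal sketch of both pieces, assuming a PyTorch decoder: an scSE block (concurrent channel and spatial squeeze-and-excitation) applied after each decoder stage, and a hypercolumn that upsamples and concatenates the decoder feature maps before the final 1x1 conv. Names and the reduction factor are illustrative.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SCSEBlock(nn.Module):
    def __init__(self, channels, reduction=16):
        super().__init__()
        # channel SE: global pool -> bottleneck -> per-channel gate
        self.cse = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),
        )
        # spatial SE: 1x1 conv -> per-pixel gate
        self.sse = nn.Sequential(nn.Conv2d(channels, 1, 1), nn.Sigmoid())

    def forward(self, x):
        return x * self.cse(x) + x * self.sse(x)

def hypercolumn(decoder_feats, size):
    # upsample every decoder feature map to `size` and concatenate them,
    # so the final 1x1 conv sees multi-scale features
    ups = [F.interpolate(f, size=size, mode='bilinear', align_corners=False)
           for f in decoder_feats]
    return torch.cat(ups, dim=1)
```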

deep supervision

I added a binary empty vs. non-empty classifier as a deep-supervision branch. Deep supervision helps the model converge faster and increases the LB score.
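
A sketch of one way to attach such a branch, with hypothetical names and an illustrative loss weight; the actual head and attachment point in this repo may differ.

```python
import torch.nn as nn
import torch.nn.functional as F

class EmptyMaskHead(nn.Module):
    """Binary empty vs. non-empty classifier on the deepest encoder feature map."""
    def __init__(self, in_channels):
        super().__init__()
        self.fc = nn.Linear(in_channels, 1)

    def forward(self, bottleneck):                      # (B, C, H, W)
        pooled = F.adaptive_avg_pool2d(bottleneck, 1).flatten(1)
        return self.fc(pooled)                          # logit: mask is non-empty?

# during training the total loss mixes both targets, e.g.:
# loss = seg_loss(mask_logits, masks) + 0.05 * F.binary_cross_entropy_with_logits(
#            cls_logits.squeeze(1), (masks.flatten(1).sum(1) > 0).float())
```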

Loss function

In my experiments, training with only the Lovasz loss (with the elu+1 trick) worked better than training with BCE in stage 1 and Lovasz in stage 2.
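
For reference, a sketch of the Lovasz hinge loss with the elu+1 variant, based on Berman et al.'s reference implementation: the relu on the sorted errors is replaced by elu + 1.

```python
import torch
import torch.nn.functional as F

def lovasz_grad(gt_sorted):
    # gradient of the Lovasz extension w.r.t. sorted errors
    p = len(gt_sorted)
    gts = gt_sorted.sum()
    intersection = gts - gt_sorted.float().cumsum(0)
    union = gts + (1. - gt_sorted).float().cumsum(0)
    jaccard = 1. - intersection / union
    if p > 1:
        jaccard[1:p] = jaccard[1:p] - jaccard[0:-1]
    return jaccard

def lovasz_hinge_elu(logits, labels):
    # flatten the image, compute hinge errors, sort, and weight by the Lovasz gradient
    logits, labels = logits.view(-1), labels.view(-1)
    signs = 2. * labels.float() - 1.
    errors = 1. - logits * signs
    errors_sorted, perm = torch.sort(errors, dim=0, descending=True)
    grad = lovasz_grad(labels[perm])
    return torch.dot(F.elu(errors_sorted) + 1., grad)  # elu+1 instead of relu
```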

LR_Scheduler

SGDR with a cyclic learning rate.
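
A minimal sketch of SGDR using PyTorch's built-in cosine-annealing warm restarts; the model, cycle length, and learning rates below are illustrative, not the exact settings used here.

```python
import torch
import torch.nn as nn

model = nn.Conv2d(1, 1, 3)  # placeholder model
optimizer = torch.optim.SGD(model.parameters(), lr=1e-2,
                            momentum=0.9, weight_decay=1e-4)
scheduler = torch.optim.lr_scheduler.CosineAnnealingWarmRestarts(
    optimizer, T_0=50, T_mult=1, eta_min=1e-4)

for epoch in range(150):
    # ... run one training epoch here ...
    scheduler.step()  # LR follows a cosine curve and restarts every 50 epochs
```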

Other excellent solutions

1st place by b.e.s.
4th place by SeuTao
5th place by AlexenderLiao
8th place by Igor Krashenyi
9th place by tugstugi
11th place by alexisrozhkov
22nd place by Vishunu
27th place by Roman Vlasov
32nd place by Oleg Yaroshevskyy
43rd place by n01z3