MeTAL - Meta-Learning with Task-Adaptive Loss Function for Few-Shot Learning (ICCV2021 Oral)
Sungyong Baik, Janghoon Choi, Heewon Kim, Dohee Cho, Jaesik Min, Kyoung Mu Lee
Official PyTorch implementation of Meta-Learning with Task-Adaptive Loss Function for Few-Shot Learning (ICCV2021 Oral)
The code is built on the public code of MAML++, whose reimplementation of MAML is used as the baseline. The code also includes an implementation of ALFA.
[Paper-arXiv] [Video]
Requirements
- Ubuntu 18.04
- Anaconda3
- Python==3.7.10
- PyTorch==1.4
- numpy==1.19.2
To install the requirements, first install Anaconda3 and then run the following:
conda create -n metal python=3.7.10
conda activate metal
bash install.sh
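As an optional sanity check (not part of the original instructions), you can verify that the activated environment resolves the expected PyTorch version:

```bash
# run inside the 'metal' conda environment created above
python -c "import torch; print(torch.__version__)"  # expected to print a 1.4.x version
```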
Datasets
For miniImageNet, the dataset can be downloaded from the link provided in the MAML++ public code. Make a directory named 'datasets' and place the downloaded miniImageNet under the 'datasets' directory.
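As a rough sketch of the expected layout (the folder name of the extracted miniImageNet data is an assumption and may differ in your download):

```bash
# create the datasets directory at the repository root
mkdir -p datasets
# move the extracted miniImageNet folder into it (source path and folder name assumed)
mv /path/to/mini_imagenet datasets/
```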
Training
To train a model, run the following command in the experiments_scripts directory:
bash MeTAL.sh $GPU_ID
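For example, assuming $GPU_ID is the integer index of the GPU to use (an assumption about the script's interface), training on GPU 0 would look like:

```bash
cd experiments_scripts
bash MeTAL.sh 0
```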
Evaluation
After training is finished, evaluation is performed automatically. To run the evaluation manually, run the same command:
bash MeTAL.sh $GPU_ID
Results

Few-shot classification accuracy on miniImageNet:
Model | Backbone | 1-shot Accuracy | 5-shot Accuracy |
---|---|---|---|
MAML | 4-CONV | 49.64 ± 0.31% | 64.99 ± 0.27% |
MeTAL | 4-CONV | 52.63 ± 0.37% | 70.52 ± 0.29% |
ALFA+MAML | 4-CONV | 50.58 ± 0.51% | 69.12 ± 0.47% |
ALFA+MeTAL | 4-CONV | 57.75 ± 0.38% | 74.10 ± 0.43% |
MAML | ResNet12 | 58.60 ± 0.42% | 69.54 ± 0.38% |
MeTAL | ResNet12 | 59.64 ± 0.38% | 76.20 ± 0.19% |
ALFA+MAML | ResNet12 | 59.74 ± 0.49% | 77.96 ± 0.41% |
ALFA+MeTAL | ResNet12 | 66.61 ± 0.28% | 81.43 ± 0.29% |
Reference
@InProceedings{baik2021meta,
  title={Meta-Learning with Task-Adaptive Loss Function for Few-Shot Learning},
  author={Baik, Sungyong and Choi, Janghoon and Kim, Heewon and Cho, Dohee and Min, Jaesik and Lee, Kyoung Mu},
  booktitle={International Conference on Computer Vision (ICCV)},
  year={2021}
}