# awesome-mixup
This repo is a collection of AWESOME things about mixup, including papers, code, etc. Feel free to star and fork. We borrow a lot from openmixup, Awesome-Mixup, awesome-domain-adaptation, and PromptPapers.
Some of these papers are summarized in tables in a Google Sheet; see the link here: Summary (Restricted)
<!-- 1. **[[]]()** x. x. [[code](x)] *x* -->
## Basics

This section collects the original mixup paper and work that explores and improves on it; a minimal sketch of the core operation follows the list.
- **[mixup'18] mixup: Beyond Empirical Risk Minimization.** ICLR 2018. [code]
  *Hongyi Zhang, Moustapha Cisse, Yann N. Dauphin, David Lopez-Paz.*
- **[Manifold Mixup'19] Manifold Mixup: Better Representations by Interpolating Hidden States.** ICML 2019. [code]
  *Vikas Verma, Alex Lamb, Christopher Beckham, Amir Najafi, Ioannis Mitliagkas, David Lopez-Paz, Yoshua Bengio.*
- **[AdaMixup'19] MixUp as Locally Linear Out-Of-Manifold Regularization.** AAAI 2019.
  *Hongyu Guo, Yongyi Mao, Richong Zhang.*
- **[CutMix'19] CutMix: Regularization Strategy to Train Strong Classifiers with Localizable Features.** ICCV 2019. [code]
  *Sangdoo Yun, Dongyoon Han, Seong Joon Oh, Sanghyuk Chun, Junsuk Choe, Youngjoon Yoo.*
- **[AugMix'20] AugMix: A Simple Data Processing Method to Improve Robustness and Uncertainty.** ICLR 2020. [code]
  *Dan Hendrycks, Norman Mu, Ekin D. Cubuk, Barret Zoph, Justin Gilmer, Balaji Lakshminarayanan.*
- **[SnapMix'21] SnapMix: Semantically Proportional Mixing for Augmenting Fine-grained Data.** AAAI 2021. [code]
  *Shaoli Huang, Xinchao Wang, Dacheng Tao.*
- **[PuzzleMix'20] Puzzle Mix: Exploiting Saliency and Local Statistics for Optimal Mixup.** ICML 2020. [code]
  *Jang-Hyun Kim, Wonho Choo, Hyun Oh Song.*
- **[SaliencyMix'21] SaliencyMix: A Saliency Guided Data Augmentation Strategy for Better Regularization.** ICLR 2021. [code]
  *A F M Shahab Uddin, Mst. Sirazam Monira, Wheemyung Shin, TaeChoong Chung, Sung-Ho Bae.*
- **[CoMixup'21] Co-Mixup: Saliency Guided Joint Mixup with Supermodular Diversity.** ICLR 2021. [code]
  *Jang-Hyun Kim, Wonho Choo, Hosan Jeong, Hyun Oh Song.*
- **[NFM'22] Noisy Feature Mixup.** ICLR 2022. [code]
  *Soon Hoe Lim, N. Benjamin Erichson, Francisco Utrera, Winnie Xu, Michael W. Mahoney.*
- **[AlignMix'22] AlignMix: Improving Representations by Interpolating Aligned Features.** CVPR 2022. [code]
  *Shashanka Venkataramanan, Ewa Kijak, Laurent Amsaleg, Yannis Avrithis.*
- **[TransMix'22] TransMix: Attend to Mix for Vision Transformers.** CVPR 2022. [code]
  *Jie-Neng Chen, Shuyang Sun, Ju He, Philip Torr, Alan Yuille, Song Bai.*
- **[GenLabel'22] GenLabel: Mixup Relabeling using Generative Models.** ICML 2022. [code]
  *Jy-yong Sohn, Liang Shang, Hongxu Chen, Jaekyun Moon, Dimitris Papailiopoulos, Kangwook Lee.*
- **[VLMixer'22] VLMixer: Unpaired Vision-Language Pre-training via Cross-Modal CutMix.** ICML 2022. [code]
  *Teng Wang, Wenhao Jiang, Zhichao Lu, Feng Zheng, Ran Cheng, Chengguo Yin, Ping Luo.*
- **[AutoMix'22] AutoMix: Unveiling the Power of Mixup for Stronger Classifiers.** ECCV 2022. [code]
  *Zicheng Liu, Siyuan Li, Di Wu, Zihan Liu, Zhiyuan Chen, Lirong Wu, Stan Z. Li.*
- **[TokenMix'22] TokenMix: Rethinking Image Mixing for Data Augmentation in Vision Transformers.** ECCV 2022. [code]
  *Jihao Liu, Boxiao Liu, Hang Zhou, Hongsheng Li, Yu Liu.*
- **[MDD'22] Towards Understanding the Data Dependency of Mixup-style Training.** ICLR 2022. [code]
  *Muthu Chidambaram, Xiang Wang, Yuzheng Hu, Chenwei Wu, Rong Ge.*
- **[WH-Mixup'22] When and How Mixup Improves Calibration.** ICML 2022.
  *Linjun Zhang, Zhun Deng, Kenji Kawaguchi, James Zou.*
- **[RegMixup'22] RegMixup: Mixup as a Regularizer Can Surprisingly Improve Accuracy and Out-of-Distribution Robustness.** NeurIPS 2022. [code]
  *Francesco Pinto, Harry Yang, Ser-Nam Lim, Philip H.S. Torr, Puneet K. Dokania.*
- **[RecursiveMix'22] RecursiveMix: Mixed Learning with History.** NeurIPS 2022. [code]
  *Lingfeng Yang, Xiang Li, Borui Zhao, Renjie Song, Jian Yang.*
- **[MSDA'22] A Unified Analysis of Mixed Sample Data Augmentation: A Loss Function Perspective.** NeurIPS 2022. [code]
  *Chanwoo Park, Sangdoo Yun, Sanghyuk Chun.*
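At its core, vanilla mixup (the first entry above) draws a coefficient from a Beta distribution and takes a convex combination of two examples and their labels. A minimal PyTorch sketch of that operation, assuming a standard classification setup; the function name `mixup_batch` is ours, not from the official code:

```python
import torch
import torch.nn.functional as F


def mixup_batch(x, y, alpha=0.2):
    """Mix a batch with a shuffled copy of itself.

    Returns the mixed inputs, both label tensors, and the mixing
    coefficient, so the caller can interpolate the loss.
    """
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    perm = torch.randperm(x.size(0))
    return lam * x + (1 - lam) * x[perm], y, y[perm], lam


# Typical training step: interpolate the losses rather than the one-hot labels.
# x_mix, y_a, y_b, lam = mixup_batch(x, y)
# logits = model(x_mix)
# loss = lam * F.cross_entropy(logits, y_a) + (1 - lam) * F.cross_entropy(logits, y_b)
```

Many of the papers above keep this recipe and change what gets mixed (hidden states, salient patches, token regions) or how the labels are reassigned.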
## Contrastive Learning with Mixup

Mixing as a tool for contrastive, metric, and self-supervised learning; a sketch of hard-negative mixing follows the list.
- **[MixCo'20] MixCo: Mix-up Contrastive Learning for Visual Representation.** NeurIPS Workshop 2020. [code]
  *Sungnyun Kim, Gihun Lee, Sangmin Bae, Se-Young Yun.*
- **[MoCHi'20] Hard Negative Mixing for Contrastive Learning.** NeurIPS 2020. [code]
  *Yannis Kalantidis, Mert Bulent Sariyildiz, Noe Pion, Philippe Weinzaepfel, Diane Larlus.*
- **[i-Mix'21] i-Mix: A Domain-Agnostic Strategy for Contrastive Representation Learning.** ICLR 2021. [code]
  *Kibok Lee, Yian Zhu, Kihyuk Sohn, Chun-Liang Li, Jinwoo Shin, Honglak Lee.*
- **[FT'21] Improving Contrastive Learning by Visualizing Feature Transformation.** ICCV 2021 (Oral). [code]
  *Rui Zhu, Bingchen Zhao, Jingen Liu, Zhenglong Sun, Chang Wen Chen.*
- **[Core-tuning'21] Unleashing the Power of Contrastive Self-Supervised Visual Models via Contrast-Regularized Fine-Tuning.** NeurIPS 2021. [code]
  *Yifan Zhang, Bryan Hooi, Dapeng Hu, Jian Liang, Jiashi Feng.*
- **[MixSiam'22] MixSiam: A Mixture-based Approach to Self-supervised Representation Learning.** AAAI 2022.
  *Xiaoyang Guo, Tianhao Zhao, Yutian Lin, Bo Du.*
- **[Un-Mix'22] Un-Mix: Rethinking Image Mixtures for Unsupervised Visual Representation Learning.** AAAI 2022. [code]
  *Zhiqiang Shen, Zechun Liu, Zhuang Liu, Marios Savvides, Trevor Darrell, Eric Xing.*
- **[Metrix'22] It Takes Two to Tango: Mixup for Deep Metric Learning.** ICLR 2022. [code]
  *Shashanka Venkataramanan, Bill Psomas, Ewa Kijak, Laurent Amsaleg, Konstantinos Karantzalos, Yannis Avrithis.*
- **[ProGCL'22] ProGCL: Rethinking Hard Negative Mining in Graph Contrastive Learning.** ICML 2022. [code]
  *Jun Xia, Lirong Wu, Ge Wang, Jintao Chen, Stan Z. Li.*
- **[M-Mix'22] M-Mix: Generating Hard Negatives via Multi-sample Mixing for Contrastive Learning.** KDD 2022. [code]
  *Shaofeng Zhang, Meng Liu, Junchi Yan, Hengrui Zhang, Lingxiao Huang, Pinyan Lu, Xiaokang Yang.*
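As a rough illustration of the hard-negative mixing idea (e.g., MoCHi above): synthesize extra negatives by convexly combining the hardest existing ones. A minimal sketch, assuming L2-normalized embeddings; the function name, shapes, and defaults are our assumptions, not the paper's code:

```python
import torch
import torch.nn.functional as F


def mix_hard_negatives(query, negatives, n_synth=16, hardest_k=32):
    """Create synthetic negatives by mixing pairs of the hardest ones.

    query: [d] L2-normalized embedding; negatives: [N, d] L2-normalized,
    with N >= hardest_k.
    """
    sims = negatives @ query                        # cosine similarity to the query
    hard = negatives[sims.topk(hardest_k).indices]  # keep only the hardest negatives
    i = torch.randint(hardest_k, (n_synth,))        # random pairs among the hard set
    j = torch.randint(hardest_k, (n_synth,))
    lam = torch.rand(n_synth, 1)                    # uniform mixing coefficients
    mixed = lam * hard[i] + (1 - lam) * hard[j]
    return F.normalize(mixed, dim=1)                # project back onto the unit sphere
```

The synthetic vectors are appended to the negative set in the contrastive loss; since they interpolate between hard negatives, they tend to stay hard without requiring extra forward passes.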
## Semi-supervised Learning with Mixup

Mixup as a consistency regularizer for unlabeled data; a sketch of interpolation consistency training follows the list.
- **[ICT'19] Interpolation Consistency Training for Semi-Supervised Learning.** IJCAI 2019. [code]
  *Vikas Verma, Kenji Kawaguchi, Alex Lamb, Juho Kannala, Yoshua Bengio, David Lopez-Paz.*
- **[MixMatch'19] MixMatch: A Holistic Approach to Semi-Supervised Learning.** NeurIPS 2019. [code]
  *David Berthelot, Nicholas Carlini, Ian Goodfellow, Nicolas Papernot, Avital Oliver, Colin Raffel.*
- **[P3MIX'22] Who Is Your Right Mixup Partner in Positive and Unlabeled Learning.** ICLR 2022. [code]
  *Changchun Li, Ximing Li, Lei Feng, Jihong Ouyang.*
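For reference, the unlabeled consistency term of ICT above fits in a few lines: the prediction on a mixed pair of unlabeled inputs should match the mix of the (fixed) teacher predictions. A rough PyTorch sketch, assuming a student/mean-teacher pair of classifiers; the function name is ours:

```python
import torch
import torch.nn.functional as F


def ict_consistency_loss(student, teacher, u1, u2, alpha=1.0):
    """Consistency on unlabeled batches u1, u2 of equal shape."""
    with torch.no_grad():                      # teacher provides fixed soft targets
        p1 = F.softmax(teacher(u1), dim=1)
        p2 = F.softmax(teacher(u2), dim=1)
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    u_mix = lam * u1 + (1 - lam) * u2          # mix the unlabeled inputs
    target = lam * p1 + (1 - lam) * p2         # mix the pseudo-labels the same way
    return F.mse_loss(F.softmax(student(u_mix), dim=1), target)
```

This term is added to the usual supervised loss on the labeled batch, typically with a ramped-up weight; MixMatch builds a richer pipeline around the same interpolation step.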
## Mixup in NLP

Since tokens are discrete, most of these methods interpolate embeddings or hidden states rather than raw inputs; a hidden-state mixup sketch follows the list.
- **[mixup-text'19] Augmenting Data with Mixup for Sentence Classification: An Empirical Study.** arXiv 2019. [code]
  *Hongyu Guo, Yongyi Mao, Richong Zhang.*
- **[TMix'20] MixText: Linguistically-Informed Interpolation of Hidden Space for Semi-Supervised Text Classification.** ACL 2020. [code]
  *Jiaao Chen, Zichao Yang, Diyi Yang.*
- **[Mixup-Transformer'20] Mixup-Transformer: Dynamic Data Augmentation for NLP Tasks.** COLING 2020.
  *Lichao Sun, Congying Xia, Wenpeng Yin, Tingting Liang, Philip S. Yu, Lifang He.*
- **[AdvAug'20] AdvAug: Robust Adversarial Augmentation for Neural Machine Translation.** ACL 2020.
  *Yong Cheng, Lu Jiang, Wolfgang Macherey, Jacob Eisenstein.*
- **[SL'20] Sequence-Level Mixed Sample Data Augmentation.** EMNLP 2020.
  *Demi Guo, Yoon Kim, Alexander Rush.*
- **[BRMC'21] Better Robustness by More Coverage: Adversarial and Mixup Data Augmentation for Robust Finetuning.** ACL Findings 2021.
  *Chenglei Si, Zhengyan Zhang, Fanchao Qi, Zhiyuan Liu, Yasheng Wang, Qun Liu, Maosong Sun.*
- **[HYPMIX'21] HypMix: Hyperbolic Interpolative Data Augmentation.** EMNLP 2021. [code]
  *Ramit Sawhney, Megh Thakkar, Shivam Agarwal, Di Jin, Diyi Yang, Lucie Flek.*
- **[SSMix'21] SSMix: Saliency-Based Span Mixup for Text Classification.** ACL Findings 2021. [code]
  *Soyoung Yoon, Gyuwan Kim, Kyumin Park.*
- **[Multilingual Mix'22] Multilingual Mix: Example Interpolation Improves Multilingual Neural Machine Translation.** ACL 2022.
  *Yong Cheng, Ankur Bapna, Orhan Firat, Yuan Cao, Pidong Wang, Wolfgang Macherey.*
- **[DMix'22] DMix: Adaptive Distance-aware Interpolative Mixup.** ACL 2022 (short paper).
  *Ramit Sawhney, Megh Thakkar, Shrey Pandit, Ritesh Soun, Di Jin, Diyi Yang, Lucie Flek.*
- **[STEMM'22] STEMM: Self-learning with Speech-text Manifold Mixup for Speech Translation.** ACL 2022. [code]
  *Qingkai Fang, Rong Ye, Lei Li, Yang Feng, Mingxuan Wang.*
- **[CsaNMT'22] Learning to Generalize to More: Continuous Semantic Augmentation for Neural Machine Translation.** ACL 2022. [code]
  *Xiangpeng Wei, Heng Yu, Yue Hu, Rongxiang Weng, Weihua Luo, Jun Xie, Rong Jin.*
- **[AUMS'22] On the Calibration of Pre-trained Language Models using Mixup Guided by Area Under the Margin and Saliency.** ACL 2022. [code]
  *Seo Yeon Park, Cornelia Caragea.*
- **[XAIMix'22] Explainability-based Mix-up Approach for Text Data Augmentation.** TKDD 2022.
  *Soonki Kwon, Younghoon Lee.*
- **[TreeMix'22] TreeMix: Compositional Constituency-based Data Augmentation for Natural Language Understanding.** NAACL 2022. [code]
  *Le Zhang, Zichao Yang, Diyi Yang.*
- **[X-Mixup'22] Enhancing Cross-lingual Transfer by Manifold Mixup.** ICLR 2022. [code]
  *Huiyun Yang, Huadong Chen, Hao Zhou, Lei Li.*
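Because text inputs are discrete, methods such as TMix/MixText and Mixup-Transformer above interpolate hidden representations instead of tokens. A loose sketch of mixing at an intermediate encoder layer; the wrapper is ours, and `layers` stands for any stack of transformer blocks mapping a tensor to a tensor:

```python
import torch


def forward_with_hidden_mixup(layers, e1, e2, mix_layer, lam):
    """Encode two embedded sequences separately up to `mix_layer`,
    interpolate their hidden states, then finish on the mixture.

    e1, e2: [batch, seq_len, dim] embedded inputs; lam in [0, 1].
    The paired labels are interpolated with the same lam.
    """
    h1, h2 = e1, e2
    for layer in layers[:mix_layer]:           # run the two inputs in parallel
        h1, h2 = layer(h1), layer(h2)
    h = lam * h1 + (1 - lam) * h2              # mix once, at the chosen depth
    for layer in layers[mix_layer:]:           # continue on the mixed states
        h = layer(h)
    return h
```

TMix samples `mix_layer` from a small set of middle layers; mixing at depth 0 recovers embedding-level mixup, which is roughly what Mixup-Transformer does.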
## Other Applications

Mixup beyond vision and language; a sketch of the time-series pretext task follows the list.
- **[SMFM'22] Boosting Factorization Machines via Saliency-Guided Mixup.** arXiv 2022. [code]
  *Chenwang Wu, Defu Lian, Yong Ge, Min Zhou, Enhong Chen, Dacheng Tao.*
- **[MIX-TS'22] Mixing Up Contrastive Learning: Self-Supervised Representation Learning for Time Series.** PRL 2022. [code]
  *Kristoffer Wickstrøm, Michael Kampffmeyer, Karl Øyvind Mikalsen, Robert Jenssen.*
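The time-series paper above turns the mixing coefficient itself into the self-supervised target: mix two series and train the model to recover how much of each is present. A bare-bones sketch of the mixing step; the function name and shape convention are ours:

```python
import torch


def mix_series(x1, x2, alpha=0.2):
    """Convexly mix two batches of time series ([batch, channels, length]).

    Returns the mixture and lam; a pretext head is then trained to
    predict lam from the encoded representations.
    """
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    return lam * x1 + (1 - lam) * x2, lam
```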