Deep Learning with Label Noise / Noisy Labels

This repo is a collection of papers and code repositories on deep learning with noisy labels. All methods listed below are briefly explained in the paper Image Classification with Deep Learning in the Presence of Noisy Labels: A Survey; more information on the topic can also be found in that survey.

| Year | Type | Conf | Repo | Title |
|------|------|------|------|-------|
| 2021 | ML | | Pt | MetaLabelNet: Learning to Generate Soft-Labels from Noisy-Labels |
| 2020 | ML | ICPR | Pt | Meta Soft Label Generation for Noisy Labels |
| 2020 | RL | | | Learning Adaptive Loss for Robust Learning with Noisy Labels |
| 2020 | LNC | | | ProSelfLC: Progressive Self Label Correction for Training Robust Deep Neural Networks |
| 2020 | DP | | | Identifying Mislabeled Data using the Area Under the Margin Ranking |
| 2020 | R | | | Limited Gradient Descent: Learning With Noisy Labels |
| 2020 | NC | | | Dual T: Reducing Estimation Error for Transition Matrix in Label-noise Learning |
| 2020 | LNC | | | Temporal Calibrated Regularization for Robust Noisy Label Learning |
| 2020 | NC | | | Parts-dependent Label Noise: Towards Instance-dependent Label Noise |
| 2020 | NC | | | Class2Simi: A New Perspective on Learning with Label Noise |
| 2020 | LNC | | | Learning from Noisy Labels with Noise Modeling Network |
| 2020 | LNC | | | ExpertNet: Adversarial Learning and Recovery Against Noisy Labels |
| 2020 | R | | Pt | Early-Learning Regularization Prevents Memorization of Noisy Labels |
| 2020 | SC | CVPR | Pt | Combating Noisy Labels by Agreement: A Joint Training Method with Co-Regularization |
| 2020 | SIW | CVPR | Tf | Distilling Effective Supervision from Severe Label Noise |
| 2020 | NC | CVPR | | Training Noise-Robust Deep Neural Networks via Meta-Learning |
| 2020 | LNC | CVPR | | Global-Local GCN: Large-Scale Label Noise Cleansing for Face Recognition |
| 2020 | SIW | ECCV | | Graph convolutional networks for learning with few clean and many noisy labels |
| 2020 | SIW | ECCV | | NoiseRank: Unsupervised Label Noise Reduction with Dependence Models |
| 2020 | R | ICLR | | Simple and Effective Regularization Methods for Training on Noisily Labeled Data with Generalization Guarantee |
| 2020 | R | ICLR | | Can Gradient Clipping Mitigate Label Noise? |
| 2020 | SSL | ICLR | Pt | DivideMix: Learning with Noisy Labels as Semi-supervised Learning |
| 2020 | SC | AAAI | | Self-Paced Robust Learning for Leveraging Clean Labels in Noisy Data |
| 2020 | LNC | IJCAI | | Learning with Noise: Improving Distantly-Supervised Fine-grained Entity Typing via Automatic Relabeling |
| 2020 | SIW | IJCAI | | Label Distribution for Learning with Noisy Labels |
| 2020 | RL | IJCAI | | Can Cross Entropy Loss Be Robust to Label Noise? |
| 2020 | SC | WACV | | Learning from noisy labels via discrepant collaborative training |
| 2020 | LNC | WACV | | A novel self-supervised re-labeling approach for training with noisy labels |
| 2020 | SC | ICML | | Searching to Exploit Memorization Effect in Learning from Corrupted Labels |
| 2020 | ML | ICML | | SIGUA: Forgetting May Make Learning with Noisy Labels More Robust |
| 2020 | R | ICML | Pt | Improving Generalization by Controlling Label-Noise Information in Neural Network Weights |
| 2020 | RL | ICML | | Normalized Loss Functions for Deep Learning with Noisy Labels |
| 2020 | RL | ICML | | Peer Loss Functions: Learning from Noisy Labels without Knowing Noise Rates |
| 2020 | SC | ICML | | Beyond Synthetic Noise: Deep Learning on Controlled Noisy Labels |
| 2020 | O | ICML | | Deep k-NN for Noisy Labels |
| 2020 | LNC | ICML | | Error-Bounded Correction of Noisy Labels |
| 2020 | O | ICML | | Does label smoothing mitigate label noise? |
| 2020 | DP | ICML | | Learning with Bounded Instance- and Label-dependent Label Noise |
| 2020 | O | ICML | | Training Binary Neural Networks through Learning with Noisy Supervision |
| 2019 | SIW | NIPS | Pt | Meta-Weight-Net: Learning an Explicit Mapping For Sample Weighting |
| 2019 | RL | ICML | | On Symmetric Losses for Learning from Corrupted Labels |
| 2019 | O | ICLR | Pt | SOSELETO: A Unified Approach to Transfer Learning and Training with Noisy Labels |
| 2019 | LNC | ICLR | | An Energy-Based Framework for Arbitrary Label Noise Correction |
| 2019 | NC | NIPS | Pt | Are Anchor Points Really Indispensable in Label-Noise Learning? |
| 2019 | O | NIPS | Pt | Combinatorial Inference against Label Noise |
| 2019 | RL | NIPS | Pt | L_DMI: A Novel Information-theoretic Loss Function for Training Deep Nets Robust to Label Noise |
| 2019 | O | CVPR | | MetaCleaner: Learning to Hallucinate Clean Representations for Noisy-Labeled Visual Recognition |
| 2019 | LNC | ICCV | | O2U-Net: A Simple Noisy Label Detection Approach for Deep Neural Networks |
| 2019 | SC | ICCV | * | Co-Mining: Deep Face Recognition with Noisy Labels |
| 2019 | O | | | NLNL: Negative Learning for Noisy Labels |
| 2019 | R | | Pt | Using Pre-Training Can Improve Model Robustness and Uncertainty |
| 2019 | SSL | | | Robust Learning Under Label Noise With Iterative Noise-Filtering |
| 2019 | ML | CVPR | Pt | Learning to Learn from Noisy Labeled Data |
| 2019 | ML | | | Pumpout: A Meta Approach for Robustly Training Deep Neural Networks with Noisy Labels |
| 2019 | RL | | Keras | Symmetric Cross Entropy for Robust Learning with Noisy Labels |
| 2019 | RL | | Caffe | Improved Mean Absolute Error for Learning Meaningful Patterns from Abnormal Training Data |
| 2019 | LQA | CVPR | | Learning From Noisy Labels By Regularized Estimation Of Annotator Confusion |
| 2019 | SIW | CVPR | Caffe | Noise-Tolerant Paradigm for Training Face Recognition CNNs |
| 2019 | SIW | ICML | Pt | Combating Label Noise in Deep Learning Using Abstention |
| 2019 | SIW | | | Robust Learning at Noisy Labeled Medical Images: Applied to Skin Lesion Classification |
| 2019 | SC | ICML | Keras | Understanding and Utilizing Deep Neural Networks Trained with Noisy Labels |
| 2019 | SC | ICML | Pt | How does Disagreement Help Generalization against Label Corruption? |
| 2019 | SC | CVPR | | Learning a Deep ConvNet for Multi-label Classification with Partial Labels |
| 2019 | SC | | | Curriculum Loss: Robust Learning and Generalization against Label Corruption |
| 2019 | SC | | | SELF: Learning to Filter Noisy Labels with Self-Ensembling |
| 2019 | LNC | CVPR | Pt | Graph Convolutional Label Noise Cleaner: Train a Plug-and-play Action Classifier for Anomaly Detection |
| 2019 | LNC | ICCV | | Photometric Transformer Networks and Label Adjustment for Breast Density Prediction |
| 2019 | LNC | CVPR | Pt | Probabilistic End-to-end Noise Correction for Learning with Noisy Labels |
| 2019 | LNC | TGRS | Matlab | Hyperspectral image classification in the presence of noisy labels |
| 2019 | LNC | ICCV | | Deep Self-Learning From Noisy Labels |
| 2019 | NC | AAAI | Tf | Safeguarded Dynamic Label Regression for Noisy Supervision |
| 2019 | NC | ICML | Pt | Unsupervised Label Noise Modeling and Loss Correction |
| 2018 | O | ECCV | | Learning with Biased Complementary Labels |
| 2018 | O | | | Robust Determinantal Generative Classifier for Noisy Labels and Adversarial Attacks |
| 2018 | R | ICLR | Keras | Dimensionality Driven Learning for Noisy Labels |
| 2018 | R | ECCV | | Deep bilevel learning |
| 2018 | SSL | WACV | | A semi-supervised two-stage approach to learning from noisy labels |
| 2018 | ML | | | Improving Multi-Person Pose Estimation using Label Correction |
| 2018 | RL | NIPS | | Generalized cross entropy loss for training deep neural networks with noisy labels |
| 2018 | LQA | ICLR | Repo | Learning From Noisy Singly-Labeled Data |
| 2018 | LQA | AAAI | | Deep learning from crowds |
| 2018 | SIW | CVPR | Repo | Iterative Learning With Open-Set Noisy Labels |
| 2018 | SIW | | Tf | Learning to Reweight Examples for Robust Deep Learning |
| 2018 | SIW | CVPR | Tf | Cleannet: Transfer Learning for Scalable Image Classifier Training with Label Noise |
| 2018 | SIW | | | ChoiceNet: Robust Learning by Revealing Output Correlations |
| 2018 | SIW | IEEE | | Multiclass Learning with Partially Corrupted Labels |
| 2018 | SC | NIPS | Pt | Co-teaching: Robust Training of Deep Neural Networks with Extremely Noisy Labels |
| 2018 | SC | IEEE | | Progressive Stochastic Learning for Noisy Labels |
| 2018 | SC | ECCV | Sklearn | Curriculumnet: Weakly supervised learning from large-scale web images |
| 2018 | LNC | CVPR | Chainer | Joint Optimization Framework for Learning with Noisy Labels |
| 2018 | LNC | TIFS | Pt, Caffe, Tf | A light CNN for deep face representation with noisy labels |
| 2018 | LNC | WACV | | Iterative cross learning on noisy labels |
| 2018 | NC | NIPS | Pt | Using trusted data to train deep networks on labels corrupted by severe noise |
| 2018 | NC | ISBI | | Training a neural network based on unreliable human annotation of medical images |
| 2018 | NC | IEEE | | Deep learning from noisy image labels with quality embedding |
| 2018 | NC | NIPS | Tf | Masking: A new perspective of noisy supervision |
| 2017 | O | | | Learning with Auxiliary Less-Noisy Labels |
| 2017 | R | | | Regularizing neural networks by penalizing confident output distributions |
| 2017 | R | | Pt | mixup: Beyond Empirical Risk Minimization |
| 2017 | MIL | CVPR | | Attend in groups: a weakly-supervised deep learning framework for learning from web data |
| 2017 | ML | ICCV | | Learning from Noisy Labels with Distillation |
| 2017 | ML | | | Avoiding your teacher's mistakes: Training neural networks with controlled weak supervision |
| 2017 | ML | | | Learning to Learn from Weak Supervision by Full Supervision |
| 2017 | RL | AAAI | | Robust Loss Functions under Label Noise for Deep Neural Networks |
| 2017 | LQA | ICLR | | Who Said What: Modeling Individual Labelers Improves Classification |
| 2017 | LQA | CVPR | | Lean crowdsourcing: Combining humans and machines in an online system |
| 2017 | SC | NIPS | Tf | Decoupling "when to update" from "how to update" |
| 2017 | SC | NIPS | Tf* | Active bias: Training more accurate neural networks by emphasizing high variance samples |
| 2017 | SC | | Tf | MentorNet: Learning Data-Driven Curriculum for Very Deep Neural Networks on Corrupted Labels |
| 2017 | SC | | Sklearn | Learning with confident examples: Rank pruning for robust classification with noisy labels |
| 2017 | SC | NIPS | | Toward Robustness against Label Noise in Training Deep Discriminative Neural Networks |
| 2017 | LNC | IEEE | | Self-Error-Correcting Convolutional Neural Network for Learning with Noisy Labels |
| 2017 | LNC | IEEE | | Improving crowdsourced label quality using noise correction |
| 2017 | LNC | | | Fidelity-weighted learning |
| 2017 | LNC | CVPR | | Learning From Noisy Large-Scale Datasets With Minimal Supervision |
| 2017 | NC | CVPR | Keras | Making Deep Neural Networks Robust to Label Noise: a Loss Correction Approach |
| 2017 | NC | ICLR | Keras | Training Deep Neural-Networks Using a Noise Adaptation Layer |
| 2016 | EM | KBS | | A robust multi-class AdaBoost algorithm for mislabeled noisy data |
| 2016 | R | CVPR | | Rethinking the inception architecture for computer vision |
| 2016 | SSL | AAAI | | Robust semi-supervised learning through label aggregation |
| 2016 | ML | NC | | Noise detection in the Meta-Learning Level |
| 2016 | RL | | | On the convergence of a family of robust losses for stochastic gradient descent |
| 2016 | RL | ICML | | Loss factorization, weakly supervised learning and label noise robustness |
| 2016 | SIW | ICLR | Matlab | Auxiliary image regularization for deep cnns with noisy labels |
| 2016 | SIW | CVPR | Caffe | Seeing Through the Human Reporting Bias: Visual Classifiers From Noisy Human-Centric Labels |
| 2016 | SC | ECCV | Repo | The Unreasonable Effectiveness of Noisy Data for Fine-Grained Recognition |
| 2016 | NC | ICDM | Matlab | Learning deep networks from noisy labels with dropout regularization |
| 2016 | NC | ICASSP | Keras | Training deep neural-networks based on unreliable labels |
| 2015 | O | | | Learning discriminative reconstructions for unsupervised outlier removal |
| 2015 | EM | | | Rboost: label noise-robust boosting algorithm based on a nonconvex loss function and the numerically stable base learners |
| 2015 | MIL | CVPR | | Visual recognition by learning from web data: A weakly supervised domain generalization approach |
| 2015 | RL | NIPS | | Learning with symmetric label noise: The importance of being unhinged |
| 2015 | RL | NC | | Making risk minimization tolerant to label noise |
| 2015 | LQA | | | Deep classifiers from image tags in the wild |
| 2015 | SIW | TPAMI | Pt | Classification with noisy labels by importance reweighting |
| 2015 | SC | ICCV | Website | Webly supervised learning of convolutional networks |
| 2015 | NC | CVPR | Caffe | Learning From Massive Noisy Labeled Data for Image Classification |
| 2015 | NC | ICLR | | Training Convolutional Networks with Noisy Labels |
| 2014 | R | | | Explaining and harnessing adversarial examples |
| 2014 | R | JMLR | | Dropout: a simple way to prevent neural networks from overfitting |
| 2014 | SC | | Keras | Training Deep Neural Networks on Noisy Labels with Bootstrapping |
| 2014 | NC | | | Learning from Noisy Labels with Deep Neural Networks |
| 2014 | LQA | | | Learning from multiple annotators with varying expertise |
| 2013 | EM | | | Boosting in the presence of label noise |
| 2013 | RL | NIPS | | Learning with Noisy Labels |
| 2013 | RL | IEEE | | Noise tolerance under risk minimization |
| 2012 | EM | | | A noise-detection based AdaBoost algorithm for mislabeled data |
| 2012 | RL | ICML | | Learning to Label Aerial Images from Noisy Data |
| 2011 | EM | | | An empirical comparison of two boosting algorithms on real data sets with artificial class noise |
| 2009 | LQA | | | Supervised learning from multiple experts: whom to trust when everyone lies a bit |
| 2008 | LQA | NIPS | | Whose vote should count more: Optimal integration of labels from labelers of unknown expertise |
| 2006 | RL | JASA | | Convexity, classification, and risk bounds |
| 2000 | EM | | | An experimental comparison of three methods for constructing ensembles of decision trees: Bagging, boosting, and randomization |

In order to test label-noise-robust algorithms on benchmark datasets (MNIST, Fashion-MNIST, CIFAR-10, CIFAR-100), synthetic noise generation is a necessary step. The following work provides a feature-dependent synthetic noise generation algorithm and pre-generated synthetic noisy labels for these datasets.
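In its simplest form, synthetic noise generation just flips a chosen fraction of labels at random. A minimal sketch of symmetric (uniform) label noise in plain Python is below; note this is an illustrative baseline, not the feature-dependent algorithm of the work referenced above, and the function name `corrupt_labels` is our own:

```python
import random

def corrupt_labels(labels, noise_rate, num_classes, seed=0):
    """Return a copy of `labels` where a `noise_rate` fraction of entries
    is flipped uniformly at random to a *different* class (symmetric noise)."""
    rng = random.Random(seed)
    noisy = list(labels)
    n_flip = int(noise_rate * len(labels))
    # pick distinct indices to corrupt
    for i in rng.sample(range(len(labels)), n_flip):
        # choose any class except the current one, so the label truly changes
        choices = [c for c in range(num_classes) if c != noisy[i]]
        noisy[i] = rng.choice(choices)
    return noisy

# e.g. 40% symmetric noise on a toy 10-class label set
clean = [i % 10 for i in range(1000)]
noisy = corrupt_labels(clean, noise_rate=0.4, num_classes=10)
```

Feature-dependent (instance-dependent) noise instead makes the flip probability depend on the sample itself, which is harder to simulate but closer to real-world annotation errors.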

List of papers that shed light on the label noise phenomenon in deep learning:

| Title | Year |
|-------|------|
| Image Classification with Deep Learning in the Presence of Noisy Labels: A Survey | 2020 |
| Investigating CNNs' Learning Representation Under Label Noise | 2019 |
| How Do Neural Networks Overcome Label Noise? | 2018 |
| Deep Learning is Robust to Massive Label Noise | 2018 |
| A closer look at memorization in deep networks | 2017 |
| Deep Nets Don't Learn via Memorization | 2017 |
| On the robustness of convnets to training on noisy labels | 2017 |
| A study of the effect of different types of noise on the precision of supervised learning techniques | 2017 |
| Understanding deep learning requires rethinking generalization | 2016 |
| A comprehensive introduction to label noise | 2014 |
| Classification in the Presence of Label Noise: a Survey | 2014 |
| Class noise and supervised learning in medical domains: The effect of feature extraction | 2006 |
| Class noise vs. attribute noise: A quantitative study | 2004 |

List of works on label noise beyond classification:

| Title | Year |
|-------|------|
| Devil is in the Edges: Learning Semantic Boundaries from Noisy Annotations | 2019 |
| Improving Semantic Segmentation via Video Propagation and Label Relaxation | 2018 |
| Learning from weak and noisy labels for semantic segmentation | 2016 |
| Robustness of conditional GANs to noisy labels | 2018 |
| Label-Noise Robust Generative Adversarial Networks | 2018 |
| Label-Noise Robust Domain Adaptation | 2020 |

Benchmark results on Clothing1M

Clothing1M is a real-world noisy-labeled dataset that is widely used for benchmarking. Below are the test accuracies on this dataset. Note that Clothing1M also provides an extra 50k clean training samples, but most methods do not use them, for a fair comparison. Therefore, only methods that do not use the extra 50k clean samples are listed here. '(?)' indicates that the given work does not mention whether the 50k clean samples were used.

| Title | Accuracy |
|-------|----------|
| MetaLabelNet: Learning to Generate Soft-Labels from Noisy-Labels | 78.20 |
| Meta Soft Label Generation for Noisy Labels | 76.02 |
| DivideMix: Learning with Noisy Labels as Semi-supervised Learning (?) | 74.76 |
| Cleannet: Transfer Learning for Scalable Image Classifier Training with Label Noise | 74.69 |
| Deep Self-Learning From Noisy Labels | 74.45 |
| Limited Gradient Descent: Learning With Noisy Labels | 74.36 |
| Are Anchor Points Really Indispensable in Label-Noise Learning? | 74.18 |
| NoiseRank: Unsupervised Label Noise Reduction with Dependence Models | 73.77 |
| Learning Adaptive Loss for Robust Learning with Noisy Labels | 73.76 |
| Meta-Weight-Net: Learning an Explicit Mapping For Sample Weighting | 73.72 |
| Probabilistic End-to-end Noise Correction for Learning with Noisy Labels | 73.49 |
| Learning to Learn from Noisy Labeled Data | 73.47 |
| Improved Mean Absolute Error for Learning Meaningful Patterns from Abnormal Training Data | 73.20 |
| Safeguarded Dynamic Label Regression for Noisy Supervision | 73.07 |
| Temporal Calibrated Regularization for Robust Noisy Label Learning | 72.54 |
| MetaCleaner: Learning to Hallucinate Clean Representations for Noisy-Labeled Visual Recognition | 72.50 |
| L_DMI: A Novel Information-theoretic Loss Function for Training Deep Nets Robust to Label Noise | 72.46 |
| Joint Optimization Framework for Learning with Noisy Labels | 72.23 |
| Error-Bounded Correction of Noisy Labels | 71.74 |
| Parts-dependent Label Noise: Towards Instance-dependent Label Noise | 71.67 |
| Dual T: Reducing Estimation Error for Transition Matrix in Label-noise Learning (?) | 71.49 |
| Improving Generalization by Controlling Label-Noise Information in Neural Network Weights | 71.39 |
| Masking: A new perspective of noisy supervision | 71.10 |
| Symmetric Cross Entropy for Robust Learning with Noisy Labels | 71.02 |
| Unsupervised Label Noise Modeling and Loss Correction | 71.00 |

Abbreviations for method types (following the taxonomy of the survey cited above) are:

- NC: Noisy Channel
- LNC: Label Noise Cleaning
- DP: Dataset Pruning
- SC: Sample Choosing
- SIW: Sample Importance Weighting
- LQA: Labeler Quality Assessment
- RL: Robust Losses
- ML: Meta Learning
- MIL: Multiple Instance Learning
- SSL: Semi-Supervised Learning
- R: Regularizers
- EM: Ensemble Methods
- O: Other

Other abbreviations (Repo column):

- Pt: PyTorch implementation
- Tf: TensorFlow implementation
- Keras / Caffe / Matlab / Sklearn / Chainer: implementation in the corresponding framework
- Repo: code available, framework unspecified

A starred (*) repo means the code is unofficial!