Awesome Mixup Methods for Supervised Learning¶
We summarize mixup methods proposed for supervised visual representation learning from two aspects: the sample mixup policy and the label mixup policy. We are working on a survey of mixup methods, and this list is continually updated.
Sample Mixup Methods¶
Pre-defined Policies¶
MixUp, [ICLR 2018] [code] mixup: Beyond Empirical Risk Minimization.
AdaMixup, [AAAI 2019] MixUp as Locally Linear Out-Of-Manifold Regularization.
CutMix, [ICCV 2019] [code] CutMix: Regularization Strategy to Train Strong Classifiers with Localizable Features.
ManifoldMix, [ICML 2019] [code] Manifold Mixup: Better Representations by Interpolating Hidden States.
FMix, [arXiv 2020] [code] FMix: Enhancing Mixed Sample Data Augmentation.
SmoothMix, [CVPRW 2020] [code] SmoothMix: a Simple Yet Effective Data Augmentation to Train Robust Classifiers.
PatchUp, [arXiv 2020] [code] PatchUp: A Regularization Technique for Convolutional Neural Networks.
GridMixup, [Pattern Recognition 2021] [code] GridMix: Strong regularization through local context mapping.
ResizeMix, [arXiv 2020] [code] ResizeMix: Mixing Data with Preserved Object Information and True Labels.
FocusMix, [ICTC 2020] Where to Cut and Paste: Data Regularization with Selective Features.
AugMix, [ICLR 2020] [code] AugMix: A Simple Data Processing Method to Improve Robustness and Uncertainty.
DJMix, [arXiv 2021] DJMix: Unsupervised Task-agnostic Augmentation for Improving Robustness.
PixMix, [arXiv 2021] [code] PixMix: Dreamlike Pictures Comprehensively Improve Safety Measures.
StyleMix, [CVPR 2021] [code] StyleMix: Separating Content and Style for Enhanced Data Augmentation.
MixStyle, [ICLR 2021] [code] Domain Generalization with MixStyle.
MoEx, [CVPR 2021] [code] On Feature Normalization and Data Augmentation.
LocalMix, [AISTATS 2021] Preventing Manifold Intrusion with Locality: Local Mixup.
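Most of the pre-defined policies above reduce to one of two mixing rules: global pixel interpolation (MixUp) or regional cut-and-paste (CutMix). Below is a minimal NumPy sketch of both rules; the function names and the single-channel image assumption are ours, not from any of the cited codebases:

```python
import numpy as np

def mixup(x_a, x_b, alpha=1.0, rng=None):
    """MixUp: convex combination of two images, lam ~ Beta(alpha, alpha)."""
    rng = rng if rng is not None else np.random.default_rng()
    lam = rng.beta(alpha, alpha)
    return lam * x_a + (1.0 - lam) * x_b, lam

def cutmix(x_a, x_b, alpha=1.0, rng=None):
    """CutMix: paste a random box from x_b into x_a; box area ~ (1 - lam)."""
    rng = rng if rng is not None else np.random.default_rng()
    h, w = x_a.shape[:2]
    lam = rng.beta(alpha, alpha)
    cut_ratio = np.sqrt(1.0 - lam)  # box side ratio, so area ratio is 1 - lam
    ch, cw = int(h * cut_ratio), int(w * cut_ratio)
    cy, cx = rng.integers(h), rng.integers(w)
    y1, y2 = np.clip(cy - ch // 2, 0, h), np.clip(cy + ch // 2, 0, h)
    x1, x2 = np.clip(cx - cw // 2, 0, w), np.clip(cx + cw // 2, 0, w)
    mixed = x_a.copy()
    mixed[y1:y2, x1:x2] = x_b[y1:y2, x1:x2]
    # Recompute lam from the exact pasted area, since clipping at the
    # image border can shrink the box; labels are mixed with this lam.
    lam = 1.0 - (y2 - y1) * (x2 - x1) / (h * w)
    return mixed, lam
```

Both functions return the mixed sample together with the mixing ratio `lam`, which is then used to mix the labels (see the label mixup section below).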
Saliency-guided Policies¶
SaliencyMix, [ICLR 2021] [code] SaliencyMix: A Saliency Guided Data Augmentation Strategy for Better Regularization.
AttentiveMix, [ICASSP 2020] [code] Attentive CutMix: An Enhanced Data Augmentation Approach for Deep Learning Based Image Classification.
SnapMix, [AAAI 2021] [code] SnapMix: Semantically Proportional Mixing for Augmenting Fine-grained Data.
AttributeMix, [Arxiv 2020] Attribute Mix: Semantic Data Augmentation for Fine Grained Recognition.
PuzzleMix, [ICML 2020] [code] Puzzle Mix: Exploiting Saliency and Local Statistics for Optimal Mixup.
CoMixup, [ICLR 2021] [code] Co-Mixup: Saliency Guided Joint Mixup with Supermodular Diversity.
SuperMix, [CVPR 2021] [code] SuperMix: Supervising the Mixing Data Augmentation.
PatchMix, [arXiv 2021] Evolving Image Compositions for Feature Representation Learning.
AutoMix, [arXiv 2021] [code] AutoMix: Unveiling the Power of Mixup for Stronger Classifiers.
AlignMix, [arXiv 2021] [code] AlignMix: Improving Representations by Interpolating Aligned Features.
SAMix, [arXiv 2021] [code] Boosting Discriminative Visual Representation Learning with Scenario-Agnostic Mixup.
AutoMix, [ECCV 2020] AutoMix: Mixup Networks for Sample Interpolation via Cooperative Barycenter Learning.
StackMix, [arXiv 2021] StackMix: A complementary Mix algorithm.
ScoreMix, [arXiv 2022] ScoreNet: Learning Non-Uniform Attention and Augmentation for Transformer-Based Histopathological Image Classification.
RecursiveMix, [arXiv 2022] [code] RecursiveMix: Mixed Learning with History.
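The saliency-guided policies above replace CutMix's random box placement with a location chosen from a saliency map, so the pasted patch carries discriminative content. The sketch below illustrates the idea with a crude gradient-magnitude proxy in place of the learned or spectral saliency models the papers actually use; all names are ours, and single-channel images are assumed:

```python
import numpy as np

def saliency_peak(img):
    """Crude saliency proxy: gradient magnitude. Real methods use
    spectral-residual saliency (SaliencyMix) or CAMs (SnapMix)."""
    gy, gx = np.gradient(img.astype(float))
    sal = np.hypot(gx, gy)
    return np.unravel_index(np.argmax(sal), sal.shape)

def saliency_cutmix(x_a, x_b, cut_ratio=0.5):
    """Paste a box from x_b, centered on x_b's most salient pixel, into x_a."""
    h, w = x_a.shape
    ch, cw = int(h * cut_ratio), int(w * cut_ratio)
    cy, cx = saliency_peak(x_b)
    y1, y2 = np.clip(cy - ch // 2, 0, h), np.clip(cy + ch // 2, 0, h)
    x1, x2 = np.clip(cx - cw // 2, 0, w), np.clip(cx + cw // 2, 0, w)
    mixed = x_a.copy()
    mixed[y1:y2, x1:x2] = x_b[y1:y2, x1:x2]
    lam = 1.0 - (y2 - y1) * (x2 - x1) / (h * w)  # label weight for x_a
    return mixed, lam
```

The optimization-based methods in this group (PuzzleMix, Co-Mixup, AutoMix) go further and solve for the mixing mask itself rather than only the patch location.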
Label Mixup Methods¶
MixUp, [ICLR 2018] [code] mixup: Beyond Empirical Risk Minimization.
CutMix, [ICCV 2019] [code] CutMix: Regularization Strategy to Train Strong Classifiers with Localizable Features.
MetaMixup, [TNNLS 2021] MetaMixUp: Learning Adaptive Interpolation Policy of MixUp with Meta-Learning.
mWH, [arXiv 2021] [code] Mixup Without Hesitation.
CAMixup, [ICLR 2021] [code] Combining Ensembles and Data Augmentation can Harm your Calibration.
Saliency Grafting, [AAAI 2022] Saliency Grafting: Innocuous Attribution-Guided Mixup with Calibrated Label Mixing.
TransMix, [CVPR 2022] [code] TransMix: Attend to Mix for Vision Transformers.
DecoupleMix, [arXiv 2022] [code] Decoupled Mixup for Data-efficient Learning.
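The default label policy, used by MixUp and CutMix, mixes one-hot labels with the same ratio λ used for the samples; the methods above refine this, for example by reweighting λ according to how much semantic mass each source actually contributes (the idea behind SnapMix and TransMix). A small sketch of both ideas, with hypothetical helper names:

```python
import numpy as np

def mixed_label(y_a, y_b, lam, num_classes):
    """Vanilla label mixing: label weights follow the sample mixing ratio."""
    one_hot = np.eye(num_classes)
    return lam * one_hot[y_a] + (1.0 - lam) * one_hot[y_b]

def mass_based_lam(saliency_a, mask):
    """Saliency-proportional label weight (SnapMix/TransMix-style idea):
    weight x_a's label by the fraction of its saliency mass that survives
    the mix, instead of by pasted area. `mask` is 1 where x_b was pasted."""
    kept = (saliency_a * (1.0 - mask)).sum()
    return kept / saliency_a.sum()
```

If the pasted region covers all of x_a's salient content, `mass_based_lam` drives x_a's label weight to zero even when the pasted area is small, which is exactly the calibration these methods aim for.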
Contribution¶
Feel free to send pull requests to add more links! Current contributors include: Siyuan Li (@Lupin1998) and Zicheng Liu (@pone7).