# Awesome Large-scale Dataset Distillation
This is a collection of our work on large-scale dataset distillation.
- **SCDD**: Self-supervised Compression Method for Dataset Distillation.
- **CDA** (@TMLR'24): Dataset Distillation via Curriculum Data Synthesis in Large Data Era.
- **SRe<sup>2</sup>L** (@NeurIPS'23 spotlight): Squeeze, Recover and Relabel: Dataset Condensation at ImageNet Scale From A New Perspective.
## Citation
```bibtex
@article{yin2023dataset,
  title={Dataset Distillation via Curriculum Data Synthesis in Large Data Era},
  author={Yin, Zeyuan and Shen, Zhiqiang},
  journal={Transactions on Machine Learning Research},
  year={2024}
}

@inproceedings{yin2023squeeze,
  title={Squeeze, Recover and Relabel: Dataset Condensation at ImageNet Scale From A New Perspective},
  author={Yin, Zeyuan and Xing, Eric and Shen, Zhiqiang},
  booktitle={Thirty-seventh Conference on Neural Information Processing Systems},
  year={2023}
}
```