[Official] Comparing Kullback-Leibler Divergence and Mean Squared Error Loss in Knowledge Distillation
This repository is the official implementation of the paper "Comparing Kullback-Leibler Divergence and Mean Squared Error Loss in Knowledge Distillation", presented at IJCAI 2021. Thanks to the contributors. [IJCAI2021Poster]
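For readers who want a quick sense of the two objectives compared in the paper, below is a minimal PyTorch sketch of a temperature-scaled KL-divergence distillation loss and an MSE loss between teacher and student logits. The function names, default temperature, and tensor shapes are illustrative assumptions, not this repository's API.

```python
import torch
import torch.nn.functional as F

def kd_kl_loss(student_logits, teacher_logits, temperature=4.0):
    # KL divergence between temperature-softened teacher and student distributions,
    # scaled by T^2 as in standard knowledge distillation.
    log_p_student = F.log_softmax(student_logits / temperature, dim=1)
    p_teacher = F.softmax(teacher_logits / temperature, dim=1)
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * (temperature ** 2)

def kd_mse_loss(student_logits, teacher_logits):
    # Mean squared error computed directly between the raw logit vectors.
    return F.mse_loss(student_logits, teacher_logits)

# Example usage with random logits of shape [batch_size, num_classes];
# in practice the teacher logits would be detached from the graph.
student_logits = torch.randn(8, 10)
teacher_logits = torch.randn(8, 10)
print(kd_kl_loss(student_logits, teacher_logits).item(),
      kd_mse_loss(student_logits, teacher_logits).item())
```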
Results
You can reproduce all results in the paper with our code. All results, including those in the Appendix, are reported in the paper. The experiments are too numerous to post everything here; however, by varying the hyperparameter values in the provided .sh files and rerunning the scripts, you can reproduce all of our analyses.
Contact
Feel free to contact us if you have any questions :)
- Jaehoon Oh: jhoon.oh@kaist.ac.kr
- Taehyeon Kim: potter32@kaist.ac.kr
Acknowledgements
This work was supported by Institute of Information & communications Technology Planning & Evaluation (IITP) grant funded by the Korea government (MSIT) [No. 2019-0-00075, Artificial Intelligence Graduate School Program (KAIST)] and [No. 2021-0-00907, Development of Adaptive and Lightweight Edge-Collaborative Analysis Technology for Enabling Proactively Immediate Response and Rapid Learning].