Distilling the Undistillable: Learning from a Nasty Teacher
Official code repository for "Distilling the Undistillable: Learning from a Nasty Teacher" by Surgan Jandial, Yash Khasbage, Arghya Pal, Vineeth N Balasubramanian, and Balaji Krishnamurthy, ECCV 2022.
Overview
- The recent work Nasty Teacher proposes to develop teachers that cannot be distilled or imitated by models attacking them.
- We examine this defense and demonstrate successful extraction of knowledge even in its presence. Specifically, we analyze Nasty Teacher from two different directions and carefully leverage these insights to develop simple yet effective methodologies, named HTC and SCM, which improve learning from a Nasty Teacher by up to 68.63% on standard datasets.
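For context, the sketch below shows the standard knowledge-distillation objective (soft KL term against temperature-scaled teacher logits plus hard cross-entropy, as in Hinton et al.) that a student would ordinarily use to learn from a teacher; Nasty Teacher is designed to degrade exactly this signal. The function name and the temperature/alpha values here are illustrative assumptions, not the repository's API.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, temperature=4.0, alpha=0.9):
    """Standard KD loss: softened KL term against the teacher plus hard cross-entropy."""
    # Temperature-softened teacher and student distributions.
    soft_targets = F.softmax(teacher_logits / temperature, dim=1)
    log_student = F.log_softmax(student_logits / temperature, dim=1)

    # KL divergence between softened distributions, scaled by T^2 as in Hinton et al.
    kd_term = F.kl_div(log_student, soft_targets, reduction="batchmean") * (temperature ** 2)

    # Ordinary cross-entropy on the ground-truth labels.
    ce_term = F.cross_entropy(student_logits, labels)

    return alpha * kd_term + (1.0 - alpha) * ce_term

# Toy usage with random logits for a 10-class problem.
student_logits = torch.randn(8, 10)
teacher_logits = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
loss = distillation_loss(student_logits, teacher_logits, labels)
```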
We acknowledge that there can be undesirable applications of this work, and thus request that you fill out this form for code access.