Classroom-Inspired Knowledge Distillation

Tags: Article · Preprint · Knowledge Distillation · Peer Ranking · Adaptive Teaching · Image Classification

Classroom-Inspired Knowledge Distillation with Adaptive Peer Ranking

We introduce ClassroomKD, a novel knowledge distillation framework inspired by classroom environments that enhances knowledge transfer among a student, a teacher, and multiple diverse peers. The framework comprises two main modules: the Knowledge Filtering (KF) Module and the Mentoring Module. In the KF Module, the student evaluates its own performance and selectively seeks feedback from higher-ranked peers and the teacher based on their prediction accuracy, retaining only effective feedback and minimizing confusion. The Mentoring Module adjusts teaching strategies according to the student's understanding level by dynamically modulating the curriculum temperature, bridging the dynamic capacity gap between the student and its mentors. Extensive experiments on CIFAR-100 demonstrate that our approach significantly improves student model performance. We show improvement over both logit-based and feature-based methods using several networks of the same and different architectural styles. Our results suggest that students learn more effectively in a diverse classroom with strategically chosen feedback and adaptive teaching, offering a promising direction for effective knowledge transfer between models.
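To make the two modules concrete, here is a minimal NumPy sketch of the idea described above. All function names and the specific temperature rule are illustrative assumptions, not the paper's exact formulation: the KF step drops any mentor whose batch accuracy does not exceed the student's, and the Mentoring step softens each surviving mentor's targets in proportion to its accuracy gap over the student.

```python
import numpy as np

def softmax(logits, t=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = logits / t
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def accuracy(logits, labels):
    """Top-1 accuracy of a model's logits on a batch."""
    return float((logits.argmax(axis=-1) == labels).mean())

def classroom_kd_targets(student_logits, mentor_logits_list, labels, base_t=4.0):
    """Hypothetical sketch of ClassroomKD's two modules.

    KF Module (assumption): keep only mentors whose batch accuracy
    exceeds the student's, so the student learns from higher-ranked
    mentors only.
    Mentoring Module (assumption): scale the distillation temperature
    by the mentor-student accuracy gap, so a much stronger mentor
    produces softer, easier-to-follow targets.
    Returns the list of soft target distributions to distill from.
    """
    s_acc = accuracy(student_logits, labels)
    targets = []
    for m_logits in mentor_logits_list:
        m_acc = accuracy(m_logits, labels)
        if m_acc <= s_acc:           # KF: filter out lower-ranked mentors
            continue
        gap = m_acc - s_acc          # Mentoring: larger gap -> higher temperature
        t = base_t * (1.0 + gap)
        targets.append(softmax(m_logits, t))
    return targets
```

In a training loop, each returned distribution would contribute a KL-divergence term against the student's temperature-scaled predictions; mentors filtered out contribute nothing, which is the "minimizing confusion" effect the abstract describes.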


TL;DR
  1. We introduce ClassroomKD, a novel knowledge distillation framework inspired by classroom environments.
  2. Our framework comprises two main modules: the Knowledge Filtering (KF) Module and the Mentoring Module.
  3. Extensive experiments on CIFAR-100 demonstrate that our approach significantly improves student model performance.
  4. Our results suggest that students learn more effectively in a diverse classroom with strategically chosen feedback and adaptive teaching.

How to Cite

If you find this useful, please include the following citation in your work:

@article{sarode2024classroom,
  title={Classroom-Inspired Knowledge Distillation with Adaptive Peer Ranking},
  author={Sarode, Shalini and Khan, Muhammad Saif Ullah and Stricker, Didier and Afzal, Muhammad Zeshan},
  journal={OpenReview Preprint},
  year={2024}
}