Classroom-Inspired Knowledge Distillation

Tags: Conference Paper · IntelliSys · Knowledge Distillation · Peer Ranking · Adaptive Teaching · Image Classification

Classroom-Inspired Multi-Mentor Distillation with Adaptive Learning Strategies

Abstract

TL;DR

  1. We introduce ClassroomKD, a novel knowledge distillation framework inspired by classroom environments.
  2. Our framework comprises two main modules: the Knowledge Filtering (KF) Module and the Mentoring Module (see the illustrative sketch after this list).
  3. Extensive experiments on CIFAR-100, ImageNet, and COCO Keypoints demonstrate that our approach significantly improves student model performance.
  4. Our results suggest that students learn more effectively in a diverse classroom with strategically chosen feedback and adaptive teaching.
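
To make the two-module structure concrete, below is a minimal, hypothetical sketch of how knowledge filtering and adaptive mentoring could be combined in a single distillation loss. It is not the paper's exact formulation: the filtering rule (keep only mentors that outperform the student on the current batch), the gap-based temperature schedule, and the weighting scheme are illustrative assumptions, and the function and parameter names (`classroom_distillation_loss`, `base_temperature`) are invented for this example.

```python
import torch
import torch.nn.functional as F


def classroom_distillation_loss(student_logits, mentor_logits_list, labels,
                                base_temperature=4.0):
    """Hypothetical sketch: filter mentors by batch accuracy, then distill
    from the remaining mentors with gap-dependent temperatures and weights."""
    ce_loss = F.cross_entropy(student_logits, labels)
    student_acc = (student_logits.argmax(dim=1) == labels).float().mean()

    kd_terms, weights = [], []
    for mentor_logits in mentor_logits_list:
        mentor_acc = (mentor_logits.argmax(dim=1) == labels).float().mean()
        # Knowledge Filtering: drop mentors that do not outperform the student.
        if mentor_acc <= student_acc:
            continue
        # Mentoring: adapt the softening temperature to the mentor-student
        # performance gap (larger gap -> softer targets). Illustrative only.
        gap = (mentor_acc - student_acc).clamp(min=0.0)
        temperature = base_temperature * (1.0 + gap)
        kd = F.kl_div(
            F.log_softmax(student_logits / temperature, dim=1),
            F.softmax(mentor_logits.detach() / temperature, dim=1),
            reduction="batchmean",
        ) * temperature ** 2
        kd_terms.append(kd)
        weights.append(gap + 1e-6)

    if not kd_terms:  # no mentor cleared the filter this step
        return ce_loss
    weights = torch.stack(weights)
    weights = weights / weights.sum()  # normalize mentor weights
    kd_loss = torch.stack(kd_terms).mul(weights).sum()
    return ce_loss + kd_loss
```

In such a setup, `mentor_logits_list` could hold the outputs of both large pretrained teachers and intermediate peer models, so the student receives feedback only from the classroom members it can currently benefit from.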