Journal The Transactions of the Korean Institute of Electrical Engineers
Title Weighted Knowledge Based Knowledge Distillation
Authors Sungjae Kang; Kisung Seo
DOI https://doi.org/10.5370/KIEE.2022.71.2.431
Pages 431-435
ISSN 1975-8359
Keywords Deep Learning; Knowledge Distillation; Teacher-Student Model; Knowledge Representation; Knowledge Transfer
Abstract We propose a novel method for controlling knowledge transfer to increase the efficiency of knowledge distillation. Specifically, combining KL divergence and predictive accuracy defines a new extended indicator, which improves the efficiency of knowledge representation. Based on this representation, each piece of knowledge is segmented into sectors by the extended indicator, and we differentiate the weights for each sector so that the loss function is weighted sector by sector. Our experiments demonstrate that the proposed method substantially improves accuracy on the CIFAR-100, CUB-200-2011, and Tiny ImageNet datasets for ResNet, MobileNet, and WRN models, compared to the results of HKD and DML. We also show that this approach can be plugged into not only conventional knowledge distillation but also collaborative knowledge distillation to improve performance.
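The sector-weighted loss the abstract describes can be sketched in code. The PyTorch fragment below is a minimal illustration under stated assumptions, not the paper's implementation: the three-way sector split (teacher-correct samples vs. teacher-incorrect samples divided by KL magnitude), the temperature T, and the names weighted_kd_loss, sector_weights, and kl_threshold are all invented for the example; the paper defines its own extended indicator from KL divergence and predictive accuracy.

```python
import torch
import torch.nn.functional as F

def weighted_kd_loss(student_logits, teacher_logits, labels,
                     T=4.0, sector_weights=(1.5, 1.0, 0.5), kl_threshold=1.0):
    """Sketch of a sector-weighted knowledge distillation loss.

    Each sample is assigned to a sector by an indicator combining the
    teacher-student KL divergence with the teacher's predictive
    correctness; the per-sample KD loss is then scaled by a
    sector-specific weight. The segmentation rule and all default
    values here are illustrative assumptions.
    """
    log_p_s = F.log_softmax(student_logits / T, dim=1)
    p_t = F.softmax(teacher_logits / T, dim=1)
    # Per-sample KL divergence between the softened teacher and student outputs.
    kl = F.kl_div(log_p_s, p_t, reduction='none').sum(dim=1)

    # Teacher predictive accuracy on each sample (correct / incorrect).
    teacher_correct = teacher_logits.argmax(dim=1).eq(labels)

    # Assumed segmentation: teacher-correct samples form one sector;
    # teacher-incorrect samples are split by KL magnitude into two more.
    w = torch.empty_like(kl)
    w[teacher_correct] = sector_weights[0]
    w[~teacher_correct & (kl < kl_threshold)] = sector_weights[1]
    w[~teacher_correct & (kl >= kl_threshold)] = sector_weights[2]

    # Weighted mean, with the standard T^2 scaling for distillation.
    return (w * kl).mean() * (T ** 2)
```

In use, a term like this would typically replace the standard distillation term in the student's total loss, alongside cross-entropy on the hard labels.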