The Transactions of the Korean Institute of Electrical Engineers

References

1. Y. LeCun, Y. Bengio, G. Hinton, 2015, Deep learning, Nature, Vol. 521, pp. 436-444.
2. A. Krizhevsky, I. Sutskever, G. Hinton, 2012, ImageNet classification with deep convolutional neural networks, Advances in Neural Information Processing Systems 25, pp. 1097-1105.
3. S. Han, H. Mao, W. Dally, 2015, Deep compression: Compressing deep neural networks with pruning, trained quantization and Huffman coding, arXiv preprint arXiv:1510.00149.
4. G. Hinton, O. Vinyals, J. Dean, 2014, Distilling the knowledge in a neural network, Neural Information Processing Systems (NIPS).
5. Y. Zhang, T. Xiang, T. Hospedales, H. Lu, 2018, Deep mutual learning, in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), pp. 4320-4328.
6. J. Tang, R. Shivanna, Z. Zhao, D. Lin, A. Singh, E. Chi, S. Jain, 2020, Understanding and improving knowledge distillation, arXiv preprint arXiv:2002.03532.
7. L. Yuan, F. Tay, G. Li, T. Wang, J. Feng, 2020, Revisiting knowledge distillation via label smoothing regularization, in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), pp. 3903-3911.
8. K. Kang, K. Seo, 2021, Knowledge distillation using the distribution characteristics of knowledge, in Proceedings of Information and Control Symposium ICS'2021, pp. 170-171.