Publisher The Korean Institute of Electrical Engineers
Journal The Transactions of the Korean Institute of Electrical Engineers
Title Performance Enhancement of Segmentation Using Single-Model-Based Knowledge Distillation
Authors 김민규(Mingyu Kim) ; 김경수(Gyeongsu Kim) ; 서기성(Kisung Seo)
DOI https://doi.org/10.5370/KIEE.2025.74.9.1575
Page pp.1575-1580
ISSN 1975-8359
Keywords Deep learning; Segmentation; Knowledge Distillation; Single-Model-Based KD; Self-KD; Mutual-KD
Abstract Image segmentation aims to classify each pixel into a specific category, separating object regions from the background, and is widely used in applications such as autonomous driving and medical imaging. Popular models include PSPNet and DeepLab V3, and many studies have focused on enhancing their performance. Knowledge distillation (KD), in which a student model learns from a teacher model’s output distribution, has been applied to reduce model size or improve performance. However, most KD methods rely on large, complex teacher models and are difficult to apply when the teacher and student architectures differ. Moreover, KD is more challenging in segmentation because standardized methods are lacking and task-specific characteristics must be considered. To address these issues, we propose applying two single-model-based KD methods to segmentation: Self-Knowledge Distillation (Self-KD), which performs distillation within a single model, and Mutual-Knowledge Distillation (Mutual-KD), in which two identical models exchange knowledge. We integrate both methods into the PSPNet architecture and validate their effectiveness on the Pascal VOC 2012 dataset, achieving mIoU improvements of 0.56 percentage points with Self-KD and 1.06 percentage points with Mutual-KD over the baseline.
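Note For illustration only, the sketch below outlines how pixel-wise distillation losses of this general kind are commonly formed in PyTorch; it is not the authors' implementation. The temperature T, the loss weight alpha, the use of a frozen snapshot of the same network as the Self-KD teacher, the void-label index 255 for Pascal VOC, and all helper names are assumptions made for the example.

import torch
import torch.nn.functional as F

def pixelwise_kd_loss(student_logits, teacher_logits, T=4.0):
    # Flatten (N, C, H, W) to (N*H*W, C) so each pixel is one class distribution.
    n, c, h, w = student_logits.shape
    s = student_logits.permute(0, 2, 3, 1).reshape(-1, c)
    t = teacher_logits.permute(0, 2, 3, 1).reshape(-1, c)
    log_p = F.log_softmax(s / T, dim=1)
    q = F.softmax(t / T, dim=1)
    # "batchmean" averages the KL divergence over pixels; T^2 is the usual KD scaling.
    return F.kl_div(log_p, q, reduction="batchmean") * (T * T)

def self_kd_step(model, teacher_snapshot, images, labels, alpha=0.5, T=4.0):
    # Self-KD sketch: a frozen copy of the same model (e.g. a previous-epoch or EMA
    # snapshot -- an assumption; the paper's exact teacher signal may differ)
    # provides the per-pixel soft targets.
    logits = model(images)
    with torch.no_grad():
        teacher_logits = teacher_snapshot(images)
    ce = F.cross_entropy(logits, labels, ignore_index=255)  # 255 = void label in VOC
    return ce + alpha * pixelwise_kd_loss(logits, teacher_logits, T)

def mutual_kd_step(model_a, model_b, images, labels, alpha=0.5, T=4.0):
    # Mutual-KD sketch: two identically structured models train together and
    # distill from each other's (detached) per-pixel predictions.
    logits_a, logits_b = model_a(images), model_b(images)
    ce_a = F.cross_entropy(logits_a, labels, ignore_index=255)
    ce_b = F.cross_entropy(logits_b, labels, ignore_index=255)
    kd_a = pixelwise_kd_loss(logits_a, logits_b.detach(), T)
    kd_b = pixelwise_kd_loss(logits_b, logits_a.detach(), T)
    return ce_a + alpha * kd_a, ce_b + alpha * kd_b

In a setup like this, each Mutual-KD model would typically be updated with its own optimizer using its own returned loss, and the distillation weight and temperature would be tuned on a validation split.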