| Title |
A Real-time Sign Language Interpretation Application for Emergency Situations
| Authors |
홍정민 (Jeong-Min Hong); 강동우 (Dong-Woo Kang); 배진영 (Jin-Young Bae); 이강록 (Gang-Rok Lee); 정재훈 (Jaehoon Jeong); 이예솝 (Yaesop Lee)
| DOI |
https://doi.org/10.5573/ieie.2026.63.1.81
| Keywords |
Assistive technology; Sign language interpretation; Real-time system; LLM
| Abstract |
This study proposes a real-time Korean Sign Language (KSL) interpretation application designed to support communication for deaf individuals with low Korean literacy during emergency situations. To overcome the known limitations of existing text- and voice-based disaster communication systems, the application provides an intuitive interface that allows a user to capture sign language video with a smartphone camera while also supplying contextual information, such as a ‘Category’ (e.g., ‘Fire’, ‘Trauma’) and ‘Context Words’ (e.g., ‘leg’, ‘fracture’). Technically, the proposed pipeline extracts hand and upper-body landmarks in real time via MediaPipe and converts them into a comprehensive feature vector, which is then fed into a Bi-LSTM-based recognition model to identify key sign language words. The recognized words are subsequently combined with the user-provided context to construct a dynamic prompt for a large language model (LLM), which generates a natural, coherent sentence for rescue personnel. On a test dataset covering 220 distinct sign language words, the recognition model achieved an accuracy of 90% and an F1-score of 89.11%. By mitigating contextual ambiguity, the proposed system enhances the timeliness and accuracy of communication in emergency situations. The study thus presents a practical, hybrid approach to a known communication barrier and provides a foundation for future work on continuous sign language recognition.
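
To make the described recognition pipeline concrete, the following is a minimal sketch, not the authors' implementation, of how MediaPipe Holistic landmarks could be flattened into per-frame feature vectors and classified by a Bi-LSTM. The sequence length, layer sizes, and the choice of the first 23 pose landmarks as the "upper body" are all assumptions for illustration; only the 220-word vocabulary comes from the abstract.

```python
# Illustrative sketch of the abstract's pipeline (not the authors' code):
# MediaPipe landmarks -> per-frame feature vector -> Bi-LSTM classifier.
import numpy as np
import mediapipe as mp
import tensorflow as tf

def frame_features(results):
    """Flatten both hands plus upper-body pose landmarks into one vector.

    Each hand has 21 landmarks; the first 23 pose landmarks cover the head,
    shoulders, and arms (an assumption standing in for "upper body").
    Missing parts are zero-filled so the vector length stays fixed.
    """
    def flatten(landmarks, count):
        if landmarks is None:
            return np.zeros(count * 3, dtype=np.float32)
        pts = landmarks.landmark[:count]
        return np.array([c for lm in pts for c in (lm.x, lm.y, lm.z)],
                        dtype=np.float32)

    return np.concatenate([
        flatten(results.left_hand_landmarks, 21),
        flatten(results.right_hand_landmarks, 21),
        flatten(results.pose_landmarks, 23),
    ])  # shape: (195,)

SEQ_LEN, FEAT_DIM, NUM_WORDS = 30, 195, 220  # assumed window; 220 from paper

# Bi-LSTM classifier over a fixed-length sequence of landmark vectors.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(SEQ_LEN, FEAT_DIM)),
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(128)),
    tf.keras.layers.Dense(NUM_WORDS, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Per-frame usage: results = mp.solutions.holistic.Holistic().process(rgb)
```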
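
The dynamic-prompt step could look like the hypothetical helper below, which merges the recognized words with the user-selected Category and Context Words. The template wording, function name, and example inputs are assumptions; the abstract does not show the actual prompt.

```python
def build_prompt(recognized_words, category, context_words):
    """Combine recognized sign words with user-provided context into an
    LLM prompt (hypothetical template; the paper's prompt is not shown)."""
    return (
        "You are relaying an emergency report to rescue personnel.\n"
        f"Emergency category: {category}\n"
        f"Context words supplied by the user: {', '.join(context_words)}\n"
        f"Recognized sign language words, in order: {', '.join(recognized_words)}\n"
        "Write one natural, coherent sentence describing the situation."
    )

# Example with assumed inputs matching the abstract's 'Trauma' scenario:
print(build_prompt(["leg", "hurt", "help"], "Trauma", ["leg", "fracture"]))
```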