Title A Study on Zero Anaphora Resolution using Pointer Network
Authors Editorial Department (Editor)
DOI https://doi.org/10.5573/ieie.2024.61.1.99
Page pp.99-107
ISSN 2287-5026
Keywords Zero-anaphora resolution (ZAR); Attention; Deep learning; Transfer learning; Language representation model
Abstract For smooth communication, a sentence must convey its exact meaning with complete arguments and predicates. However, required arguments are frequently omitted in many languages, including Korean. Zero anaphora is the phenomenon in which even an essential argument of a predicate is omitted from a sentence. Research on zero anaphora resolution with deep learning models is actively being conducted for Korean, Japanese, and Chinese. In this paper, we attempt to restore the omitted essential arguments of predicates using a deep learning model that combines a Pointer Network with FastText embeddings. In a performance comparison with various deep learning models, our model achieved up to 21% higher accuracy.
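The abstract describes a Pointer Network over token encodings, which resolves a zero anaphor by "pointing" at the input position of its antecedent. As a minimal sketch of that mechanism (not the paper's implementation: the weight names, dimensions, and additive Bahdanau-style scoring here are assumptions), the pointer distribution can be computed like this:

```python
import numpy as np

rng = np.random.default_rng(0)

def pointer_attention(enc, dec, W1, W2, v):
    """Additive pointer attention: score each encoder position against the
    decoder state, then softmax over positions. The result is a probability
    distribution over INPUT tokens rather than over an output vocabulary."""
    # (d, seq) scores from encoder states plus broadcast decoder state
    hidden = np.tanh(W1 @ enc.T + (W2 @ dec)[:, None])
    scores = v @ hidden                      # (seq_len,)
    exp = np.exp(scores - scores.max())      # numerically stable softmax
    return exp / exp.sum()

d_model, seq_len = 8, 5
enc = rng.standard_normal((seq_len, d_model))  # token encodings (e.g. built from FastText vectors)
dec = rng.standard_normal(d_model)             # state for the omitted-argument slot
W1 = rng.standard_normal((d_model, d_model))   # hypothetical learned weights
W2 = rng.standard_normal((d_model, d_model))
v = rng.standard_normal(d_model)

p = pointer_attention(enc, dec, W1, W2, v)
antecedent = int(np.argmax(p))  # predicted input position of the antecedent
```

Because the output space is the input sequence itself, the model can recover an omitted argument by selecting the token that fills it, which is the property the abstract relies on.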