Research Outcomes

International Journal
Paper Title: Target-Oriented Knowledge Distillation with Language-Family-Based Grouping for Multilingual NMT
Publication Date: 2022-06-30
Journal: ACM Transactions on Asian and Low-Resource Language Information Processing
Supervising Professor:
Paper Type: 01 SCI
First Author: 도희진
Corresponding Author: 이근배
Co-author: 이근배
Impact Factor: 1.920
Keywords:

Multilingual NMT has developed rapidly, but it still suffers performance degradation caused by language diversity and model capacity constraints. To achieve competitive multilingual translation accuracy despite such limitations, knowledge distillation, which improves the student network by matching the teacher network’s output, has been applied and has shown improvements by focusing on the important parts of the teacher distribution. However, existing knowledge distillation methods for multilingual NMT rarely consider the distilled knowledge itself, which serves an important function as the student model’s target, in the process. In this paper, we propose two distillation strategies that effectively use this knowledge to improve the accuracy of multilingual NMT. First, we introduce a language-family-based approach that guides the selection of appropriate knowledge for each language pair. By distilling the knowledge of multilingual teachers, each of which handles a group of languages classified by language family, the multilingual model overcomes the accuracy degradation caused by linguistic diversity. Second, we propose target-oriented knowledge distillation, which focuses intensively on the ground-truth target of the knowledge with a penalty strategy. Our method provides a sensible distillation by penalizing samples without actual targets while additionally emphasizing the ground-truth targets. Experiments on TED Talks datasets demonstrate the effectiveness of our method with BLEU score improvements. Discussions of the distilled knowledge and further observations of the methods also validate our results.
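To make the target-oriented idea concrete, the sketch below illustrates one plausible form of a distillation loss that penalizes teacher outputs whose top prediction misses the ground-truth target while giving extra weight to positions where it matches. The function name, weighting scheme, and hyperparameters (target_boost, penalty) are illustrative assumptions for this sketch, not the paper's exact formulation.

```python
# Hypothetical sketch of a target-oriented knowledge distillation loss.
# The exact weighting used in the paper may differ; this only illustrates
# the general idea of emphasizing ground-truth targets during distillation.
import torch
import torch.nn.functional as F


def target_oriented_kd_loss(student_logits, teacher_logits, targets,
                            temperature=2.0, target_boost=1.0, penalty=0.5):
    """Distill teacher probabilities into the student, weighting each token
    position by whether the teacher's prediction hits the ground-truth target.

    student_logits, teacher_logits: (batch, seq_len, vocab)
    targets: (batch, seq_len) ground-truth token ids
    """
    t_probs = F.softmax(teacher_logits / temperature, dim=-1)
    s_logp = F.log_softmax(student_logits / temperature, dim=-1)

    # Token-level KL divergence between teacher and student distributions.
    kl = (t_probs * (t_probs.clamp_min(1e-9).log() - s_logp)).sum(dim=-1)

    # Up-weight positions where the teacher's argmax matches the reference
    # target; down-weight (penalize) positions where it does not.
    teacher_hit = (t_probs.argmax(dim=-1) == targets).float()
    weights = teacher_hit * (1.0 + target_boost) + (1.0 - teacher_hit) * penalty

    return (weights * kl).mean()
```

In the language-family-based setting described above, one would presumably compute this loss against the teacher assigned to the language family of each training pair, rather than against a single teacher for all languages.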
