Paper Title | Normalizing Mutual Information for Robust Adaptive Training for Translation |
---|---|
Date | 2022.12.07 |
Conference | The 2022 Conference on Empirical Methods in Natural Language Processing |
Supervising Professor | |
Presentation Type | Oral presentation |
First Author | Youngwon Lee |
Corresponding Author | Seung-won Hwang |
Co-authors | Changmin Lee, Hojin Lee, Seung-won Hwang |
Domestic/International | International |
Host Country | NA |
Organizing Institution | |
Abstract | Despite the success of neural machine translation models, tensions between the fluency of optimizing target language modeling and source-faithfulness remain as challenges. Previously, Conditional Bilingual Mutual Information (CBMI), a scoring metric for the importance of target sentences and tokens, was proposed to encourage fluent and faithful translations. The score is obtained by combining the probability from the translation model and the target language model, which is then used to assign different weights to the losses of sentences and tokens. […] results from En-De, De-En, and En-Ro translation tasks […] |
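
As background for the abstract above, the token-level CBMI from the prior work it references is commonly defined as CBMI(x; y_t) = log p_TM(y_t | x, y_<t) − log p_LM(y_t | y_<t), i.e., the translation model's log-probability minus the target language model's log-probability for each target token. Below is a minimal PyTorch sketch of turning such scores into per-token loss weights; the batch-level standardization and the `beta` hyperparameter are illustrative assumptions, not the paper's exact weighting scheme.

```python
import torch

def cbmi_loss_weights(tm_logp: torch.Tensor,
                      lm_logp: torch.Tensor,
                      beta: float = 0.1) -> torch.Tensor:
    """Turn token-level CBMI scores into per-token loss weights (sketch).

    tm_logp: log p_TM(y_t | x, y_<t), translation-model log-probs, shape (batch, seq_len)
    lm_logp: log p_LM(y_t | y_<t), target language-model log-probs, same shape
    beta:    hypothetical scaling hyperparameter, assumed here for illustration
    """
    # Token-level CBMI: log p_TM(y_t | x, y_<t) - log p_LM(y_t | y_<t)
    cbmi = tm_logp - lm_logp
    # Standardize scores within the batch so weights center around 1.0;
    # this particular mapping is an assumption, not the paper's method.
    standardized = (cbmi - cbmi.mean()) / (cbmi.std() + 1e-8)
    weights = (1.0 + beta * standardized).clamp(min=0.0)
    return weights
```

In an adaptive-training setup of this kind, a weighted cross-entropy loss would multiply each token's negative log-likelihood by its weight before averaging, so that tokens scored as more informative contribute more to the gradient.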