Learning to parse and translate improves neural machine translation

Akiko Eriguchi, Yoshimasa Tsuruoka, Kyunghyun Cho

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

There has been relatively little attention to incorporating linguistic priors into neural machine translation. Much of the previous work has further been constrained to considering a linguistic prior on the source side only. In this paper, we propose a hybrid model, called NMT+RNNG, that learns to parse and translate by incorporating the recurrent neural network grammar (RNNG) into an attention-based neural machine translation model. Our approach encourages the neural machine translation model to incorporate a linguistic prior during training, and lets it translate on its own afterward. Extensive experiments with four language pairs show the effectiveness of the proposed NMT+RNNG.
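The abstract does not spell out how the parsing and translation objectives are combined during training. A standard multi-task formulation, which a hybrid parse-and-translate model of this kind would plausibly optimize, is sketched below; the symbols and the weighting term λ are illustrative assumptions, not taken from the paper.

```latex
% Joint training objective: maximize the translation log-likelihood
% together with the parsing (RNNG action) log-likelihood over the
% shared parameters \theta; \lambda trades off the two tasks.
\mathcal{L}(\theta) =
  \mathcal{L}_{\mathrm{trans}}(\theta)
  + \lambda \, \mathcal{L}_{\mathrm{parse}}(\theta)
```

Under such a formulation the parsing term acts as the linguistic prior during training and can be dropped at test time, which is consistent with the abstract's claim that the model "translates on its own afterward".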

Original language: English (US)
Title of host publication: ACL 2017 - 55th Annual Meeting of the Association for Computational Linguistics, Proceedings of the Conference (Short Papers)
Publisher: Association for Computational Linguistics (ACL)
Pages: 72-78
Number of pages: 7
Volume: 2
ISBN (Electronic): 9781945626760
DOI: 10.18653/v1/P17-2012
State: Published - Jan 1 2017
Event: 55th Annual Meeting of the Association for Computational Linguistics, ACL 2017 - Vancouver, Canada
Duration: Jul 30 2017 - Aug 4 2017



ASJC Scopus subject areas

  • Language and Linguistics
  • Artificial Intelligence
  • Software
  • Linguistics and Language

Cite this

Eriguchi, A., Tsuruoka, Y., & Cho, K. (2017). Learning to parse and translate improves neural machine translation. In ACL 2017 - 55th Annual Meeting of the Association for Computational Linguistics, Proceedings of the Conference (Short Papers) (Vol. 2, pp. 72-78). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/P17-2012

@inproceedings{b4b945629c514edcbe51b42a59481195,
title = "Learning to parse and translate improves neural machine translation",
abstract = "There has been relatively little attention to incorporating linguistic prior to neural machine translation. Much of the previous work was further constrained to considering linguistic prior on the source side. In this paper, we propose a hybrid model, called NMT+RNNG, that learns to parse and translate by combining the recurrent neural network grammar into the attention-based neural machine translation. Our approach encourages the neural machine translation model to incorporate linguistic prior during training, and lets it translate on its own afterward. Extensive experiments with four language pairs show the effectiveness of the proposed NMT+RNNG.",
author = "Akiko Eriguchi and Yoshimasa Tsuruoka and Kyunghyun Cho",
year = "2017",
month = "1",
day = "1",
doi = "10.18653/v1/P17-2012",
language = "English (US)",
isbn = "9781945626760",
volume = "2",
pages = "72--78",
booktitle = "ACL 2017 - 55th Annual Meeting of the Association for Computational Linguistics, Proceedings of the Conference (Short Papers)",
publisher = "Association for Computational Linguistics (ACL)",
address = "Vancouver, Canada",
}

TY - GEN

T1 - Learning to parse and translate improves neural machine translation

AU - Eriguchi, Akiko

AU - Tsuruoka, Yoshimasa

AU - Cho, Kyunghyun

PY - 2017/1/1

Y1 - 2017/1/1

N2 - There has been relatively little attention to incorporating linguistic prior to neural machine translation. Much of the previous work was further constrained to considering linguistic prior on the source side. In this paper, we propose a hybrid model, called NMT+RNNG, that learns to parse and translate by combining the recurrent neural network grammar into the attention-based neural machine translation. Our approach encourages the neural machine translation model to incorporate linguistic prior during training, and lets it translate on its own afterward. Extensive experiments with four language pairs show the effectiveness of the proposed NMT+RNNG.

AB - There has been relatively little attention to incorporating linguistic prior to neural machine translation. Much of the previous work was further constrained to considering linguistic prior on the source side. In this paper, we propose a hybrid model, called NMT+RNNG, that learns to parse and translate by combining the recurrent neural network grammar into the attention-based neural machine translation. Our approach encourages the neural machine translation model to incorporate linguistic prior during training, and lets it translate on its own afterward. Extensive experiments with four language pairs show the effectiveness of the proposed NMT+RNNG.

UR - http://www.scopus.com/inward/record.url?scp=85040614869&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85040614869&partnerID=8YFLogxK

U2 - 10.18653/v1/P17-2012

DO - 10.18653/v1/P17-2012

M3 - Conference contribution

AN - SCOPUS:85040614869

VL - 2

SP - 72

EP - 78

BT - ACL 2017 - 55th Annual Meeting of the Association for Computational Linguistics, Proceedings of the Conference (Short Papers)

PB - Association for Computational Linguistics (ACL)

ER -