Zero-resource translation with multi-lingual neural machine translation

Orhan Firat, Baskaran Sankaran, Yaser Al-Onaizan, Fatos T. Yarman Vural, Kyunghyun Cho

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

In this paper, we propose a novel finetuning algorithm for the recently introduced multi-way, multilingual neural machine translation model that enables zero-resource machine translation. When used together with novel many-to-one translation strategies, we empirically show that this finetuning algorithm allows the multi-way, multilingual model to translate a zero-resource language pair (1) as well as a single-pair neural translation model trained with up to 1M direct parallel sentences of the same language pair and (2) better than a pivot-based translation strategy, while keeping only one additional copy of attention-related parameters.

Original language: English (US)
Title of host publication: EMNLP 2016 - Conference on Empirical Methods in Natural Language Processing, Proceedings
Publisher: Association for Computational Linguistics (ACL)
Pages: 268-277
Number of pages: 10
ISBN (Electronic): 9781945626258
State: Published - Jan 1 2016
Event: 2016 Conference on Empirical Methods in Natural Language Processing, EMNLP 2016 - Austin, United States
Duration: Nov 1 2016 - Nov 5 2016

Publication series

Name: EMNLP 2016 - Conference on Empirical Methods in Natural Language Processing, Proceedings

Conference

Conference: 2016 Conference on Empirical Methods in Natural Language Processing, EMNLP 2016
Country: United States
City: Austin
Period: 11/1/16 - 11/5/16

ASJC Scopus subject areas

  • Computer Science Applications
  • Information Systems
  • Computational Theory and Mathematics

Cite this

Firat, O., Sankaran, B., Al-Onaizan, Y., Yarman Vural, F. T., & Cho, K. (2016). Zero-resource translation with multi-lingual neural machine translation. In EMNLP 2016 - Conference on Empirical Methods in Natural Language Processing, Proceedings (pp. 268-277). (EMNLP 2016 - Conference on Empirical Methods in Natural Language Processing, Proceedings). Association for Computational Linguistics (ACL).

@inproceedings{4ffd4ee0e938489aae6b47f88fc3258a,
title = "Zero-resource translation with multi-lingual neural machine translation",
abstract = "In this paper, we propose a novel finetuning algorithm for the recently introduced multi-way, multilingual neural machine translation model that enables zero-resource machine translation. When used together with novel many-to-one translation strategies, we empirically show that this finetuning algorithm allows the multi-way, multilingual model to translate a zero-resource language pair (1) as well as a single-pair neural translation model trained with up to 1M direct parallel sentences of the same language pair and (2) better than a pivot-based translation strategy, while keeping only one additional copy of attention-related parameters.",
author = "Orhan Firat and Baskaran Sankaran and Yaser Al-Onaizan and {Yarman Vural}, {Fatos T.} and Kyunghyun Cho",
year = "2016",
month = "1",
day = "1",
language = "English (US)",
series = "EMNLP 2016 - Conference on Empirical Methods in Natural Language Processing, Proceedings",
publisher = "Association for Computational Linguistics (ACL)",
pages = "268--277",
booktitle = "EMNLP 2016 - Conference on Empirical Methods in Natural Language Processing, Proceedings",

}

TY - GEN

T1 - Zero-resource translation with multi-lingual neural machine translation

AU - Firat, Orhan

AU - Sankaran, Baskaran

AU - Al-Onaizan, Yaser

AU - Yarman Vural, Fatos T.

AU - Cho, Kyunghyun

PY - 2016/1/1

Y1 - 2016/1/1

N2 - In this paper, we propose a novel finetuning algorithm for the recently introduced multi-way, multilingual neural machine translation model that enables zero-resource machine translation. When used together with novel many-to-one translation strategies, we empirically show that this finetuning algorithm allows the multi-way, multilingual model to translate a zero-resource language pair (1) as well as a single-pair neural translation model trained with up to 1M direct parallel sentences of the same language pair and (2) better than a pivot-based translation strategy, while keeping only one additional copy of attention-related parameters.

AB - In this paper, we propose a novel finetuning algorithm for the recently introduced multi-way, multilingual neural machine translation model that enables zero-resource machine translation. When used together with novel many-to-one translation strategies, we empirically show that this finetuning algorithm allows the multi-way, multilingual model to translate a zero-resource language pair (1) as well as a single-pair neural translation model trained with up to 1M direct parallel sentences of the same language pair and (2) better than a pivot-based translation strategy, while keeping only one additional copy of attention-related parameters.

UR - http://www.scopus.com/inward/record.url?scp=85024095465&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85024095465&partnerID=8YFLogxK

M3 - Conference contribution

AN - SCOPUS:85024095465

T3 - EMNLP 2016 - Conference on Empirical Methods in Natural Language Processing, Proceedings

SP - 268

EP - 277

BT - EMNLP 2016 - Conference on Empirical Methods in Natural Language Processing, Proceedings

PB - Association for Computational Linguistics (ACL)

ER -