Zero-shot transfer learning for event extraction

Lifu Huang, Heng Ji, Kyunghyun Cho, Ido Dagan, Sebastian Riedel, Clare R. Voss

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Most previous supervised event extraction methods have relied on features derived from manual annotations, and thus cannot be applied to new event types without extra annotation effort. We take a fresh look at event extraction and model it as a generic grounding problem: mapping each event mention to a specific type in a target event ontology. We design a transferable architecture of structural and compositional neural networks to jointly represent and map event mentions and types into a shared semantic space. Based on this new framework, we can select, for each event mention, the event type which is semantically closest in this space as its type. By leveraging manual annotations available for a small set of existing event types, our framework can be applied to new unseen event types without additional manual annotations. When tested on 23 unseen event types, this zero-shot framework, without manual annotations, achieves performance comparable to a supervised model trained from 3,000 sentences annotated with 500 event mentions.
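The type-selection step the abstract describes — choosing, for each event mention, the event type whose representation is semantically closest in the shared space — amounts to a nearest-neighbor lookup over type embeddings. The sketch below illustrates only that final selection step with cosine similarity; the vectors and type names are illustrative placeholders, not values from the paper, which learns the mention and type representations with structural and compositional neural networks.

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def closest_type(mention_vec, type_vecs):
    """Return the ontology type whose embedding is nearest to the mention."""
    return max(type_vecs, key=lambda t: cosine(mention_vec, type_vecs[t]))

# Toy embeddings for three hypothetical event types in a 3-d "semantic space".
types = {
    "Attack":    [0.9, 0.1, 0.0],
    "Transport": [0.1, 0.8, 0.2],
    "Elect":     [0.0, 0.2, 0.9],
}

print(closest_type([0.85, 0.2, 0.05], types))  # prints "Attack"
```

Because unseen types only need an embedding (derived from the type's name and argument roles), no annotated examples of those types are required at selection time — which is what makes the zero-shot transfer possible.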

Original language: English (US)
Title of host publication: ACL 2018 - 56th Annual Meeting of the Association for Computational Linguistics, Proceedings of the Conference (Long Papers)
Publisher: Association for Computational Linguistics (ACL)
Pages: 2160-2170
Number of pages: 11
ISBN (Electronic): 9781948087322
State: Published - Jan 1 2018
Event: 56th Annual Meeting of the Association for Computational Linguistics, ACL 2018 - Melbourne, Australia
Duration: Jul 15 2018 - Jul 20 2018

Publication series

Name: ACL 2018 - 56th Annual Meeting of the Association for Computational Linguistics, Proceedings of the Conference (Long Papers)
Volume: 1

Conference

Conference: 56th Annual Meeting of the Association for Computational Linguistics, ACL 2018
Country: Australia
City: Melbourne
Period: 7/15/18 - 7/20/18

ASJC Scopus subject areas

  • Software
  • Computational Theory and Mathematics

Cite this

Huang, L., Ji, H., Cho, K., Dagan, I., Riedel, S., & Voss, C. R. (2018). Zero-shot transfer learning for event extraction. In ACL 2018 - 56th Annual Meeting of the Association for Computational Linguistics, Proceedings of the Conference (Long Papers) (pp. 2160-2170). (ACL 2018 - 56th Annual Meeting of the Association for Computational Linguistics, Proceedings of the Conference (Long Papers); Vol. 1). Association for Computational Linguistics (ACL).
