Large margin nearest neighbor embedding for knowledge representation

Miao Fan, Qiang Zhou, Thomas Fang Zheng, Ralph Grishman

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

The traditional way of storing facts as triplets (head entity, relation, tail entity), abbreviated as (h, r, t), allows knowledge to be intuitively displayed and easily acquired by human beings, but hardly computed or even reasoned about by AI machines. Inspired by the success of applying distributed representations to AI-related fields, recent studies aim to represent each entity and relation with a unique low-dimensional embedding, which differs from the symbolic and atomic framework of displaying knowledge in triplets. In this way, knowledge computing and reasoning can be facilitated by a simple vector calculation, i.e., h + r ≈ t. We thus contribute an effective model that learns better embeddings satisfying this formula by pulling the positive tail entities t+ together and close to h + r (Nearest Neighbor), while simultaneously pushing the negatives t- away from the positives t+ by keeping a Large Margin. We also design a corresponding learning algorithm that efficiently finds the optimal solution via Stochastic Gradient Descent in an iterative fashion. Quantitative experiments illustrate that our approach achieves state-of-the-art performance compared with several recent methods on benchmark datasets for two classical applications, i.e., link prediction and triplet classification. Moreover, we analyze the parameter complexities of all the evaluated models, and the analytical results indicate that our model needs fewer computational resources while outperforming the other methods.
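The objective sketched in the abstract — pull positive tails toward h + r, push corrupted tails beyond a margin, and optimize by SGD — can be illustrated with a minimal toy example. This is only a sketch of the generic margin-based hinge loss under assumed hyperparameters and random toy data, not the authors' actual implementation or datasets:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy knowledge graph: small entity/relation vocabularies and a few (h, r, t) facts.
n_entities, n_relations, dim = 5, 2, 8
E = rng.normal(scale=0.1, size=(n_entities, dim))   # entity embeddings
R = rng.normal(scale=0.1, size=(n_relations, dim))  # relation embeddings
facts = [(0, 0, 1), (1, 1, 2), (3, 0, 4)]

def dist_sq(h, r, t):
    """Squared L2 distance between h + r and t, i.e. how badly h + r ≈ t fails."""
    diff = E[h] + R[r] - E[t]
    return diff @ diff

margin, lr = 1.0, 0.01
for epoch in range(200):
    for h, r, t_pos in facts:
        # Build a negative triplet by corrupting the tail with a random entity.
        t_neg = rng.integers(n_entities)
        if t_neg == t_pos:
            continue
        # Hinge loss: the true tail should be at least `margin` closer than the negative.
        loss = margin + dist_sq(h, r, t_pos) - dist_sq(h, r, t_neg)
        if loss <= 0:
            continue  # margin already satisfied; no update needed
        # SGD step: gradients of the two squared distances w.r.t. each embedding.
        g_pos = 2 * (E[h] + R[r] - E[t_pos])
        g_neg = 2 * (E[h] + R[r] - E[t_neg])
        E[h]     -= lr * (g_pos - g_neg)
        R[r]     -= lr * (g_pos - g_neg)
        E[t_pos] += lr * g_pos   # true tail moves toward h + r
        E[t_neg] -= lr * g_neg   # corrupted tail is pushed away
```

Each update only fires on triplets that violate the margin, which is what keeps the positives clustered near h + r while the negatives are driven outside the margin boundary.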

Original language: English (US)
Title of host publication: Proceedings - 2015 IEEE/WIC/ACM International Joint Conference on Web Intelligence and Intelligent Agent Technology, WI-IAT 2015
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 53-59
Number of pages: 7
Volume: 1
ISBN (Electronic): 9781467396172
DOI: 10.1109/WI-IAT.2015.125
State: Published - Feb 2 2016
Event: 2015 IEEE/WIC/ACM International Joint Conference on Web Intelligence and Intelligent Agent Technology Workshops, WI-IAT Workshops 2015 - Singapore, Singapore
Duration: Dec 6 2015 - Dec 9 2015


Fingerprint

  • Knowledge representation
  • Learning algorithms
  • Experiments

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Software

Cite this

Fan, M., Zhou, Q., Zheng, T. F., & Grishman, R. (2016). Large margin nearest neighbor embedding for knowledge representation. In Proceedings - 2015 IEEE/WIC/ACM International Joint Conference on Web Intelligence and Intelligent Agent Technology, WI-IAT 2015 (Vol. 1, pp. 53-59). [7396779] Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/WI-IAT.2015.125

