Semantic Relation Classification via Bidirectional LSTM Networks with Entity-Aware Attention Using Latent Entity Typing
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Lee, Joohong | - |
dc.contributor.author | Seo, Sangwoo | - |
dc.contributor.author | Choi, Yong Suk | - |
dc.date.accessioned | 2022-07-09T14:27:39Z | - |
dc.date.available | 2022-07-09T14:27:39Z | - |
dc.date.created | 2021-05-12 | - |
dc.date.issued | 2019-06 | - |
dc.identifier.issn | 2073-8994 | - |
dc.identifier.uri | https://scholarworks.bwise.kr/hanyang/handle/2021.sw.hanyang/147692 | - |
dc.description.abstract | Classifying semantic relations between entity pairs in sentences is an important task in natural language processing (NLP). Most previous models applied to relation classification rely on high-level lexical and syntactic features obtained from NLP tools such as WordNet, dependency parsers, part-of-speech (POS) taggers, and named entity recognizers (NER). In addition, state-of-the-art neural models based on attention mechanisms do not fully utilize information about the entities, which may be the most crucial feature for relation classification. To address these issues, we propose a novel end-to-end recurrent neural model that incorporates an entity-aware attention mechanism with a latent entity typing (LET) method. Our model not only effectively utilizes entities and their latent types as features, but also builds word representations by applying self-attention based on the symmetrical similarity of a sentence with itself. Moreover, the model is interpretable through visualization of the applied attention mechanisms. Experimental results on the SemEval-2010 Task 8 dataset, one of the most popular relation classification benchmarks, demonstrate that our model outperforms existing state-of-the-art models without any high-level features. | - |
dc.language | English | - |
dc.language.iso | en | - |
dc.publisher | MDPI | - |
dc.title | Semantic Relation Classification via Bidirectional LSTM Networks with Entity-Aware Attention Using Latent Entity Typing | - |
dc.type | Article | - |
dc.contributor.affiliatedAuthor | Choi, Yong Suk | - |
dc.identifier.doi | 10.3390/sym11060785 | - |
dc.identifier.scopusid | 2-s2.0-85068073042 | - |
dc.identifier.wosid | 000475703000060 | - |
dc.identifier.bibliographicCitation | SYMMETRY-BASEL, v.11, no.6, pp.1 - 12 | - |
dc.relation.isPartOf | SYMMETRY-BASEL | - |
dc.citation.title | SYMMETRY-BASEL | - |
dc.citation.volume | 11 | - |
dc.citation.number | 6 | - |
dc.citation.startPage | 1 | - |
dc.citation.endPage | 12 | - |
dc.type.rims | ART | - |
dc.type.docType | Article | - |
dc.description.journalClass | 1 | - |
dc.description.isOpenAccess | Y | - |
dc.description.journalRegisteredClass | scie | - |
dc.description.journalRegisteredClass | scopus | - |
dc.relation.journalResearchArea | Science & Technology - Other Topics | - |
dc.relation.journalWebOfScienceCategory | Multidisciplinary Sciences | - |
dc.subject.keywordAuthor | relation extraction | - |
dc.subject.keywordAuthor | entity-aware attention | - |
dc.subject.keywordAuthor | latent entity typing | - |
dc.subject.keywordAuthor | end-to-end learning | - |
dc.subject.keywordAuthor | visualization | - |
dc.identifier.url | https://www.mdpi.com/2073-8994/11/6/785 | - |
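The abstract's latent entity typing (LET) component soft-selects among a small set of learned latent type vectors by attending from an entity representation. The sketch below illustrates that idea in plain Python; the function names, the dot-product scoring, and the number of latent types are illustrative assumptions, not the paper's exact formulation.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def latent_entity_type(entity_vec, type_embeddings):
    """Sketch of LET-style soft typing (assumed formulation):
    score each of K latent type vectors t_k against the entity
    representation e, normalize with softmax, and return the
    attention-weighted mixture sum_k alpha_k * t_k."""
    scores = [dot(entity_vec, t) for t in type_embeddings]
    alphas = softmax(scores)
    dim = len(entity_vec)
    return [sum(a * t[i] for a, t in zip(alphas, type_embeddings))
            for i in range(dim)]

# Toy usage: an entity vector aligned with the first latent type
# receives most of its attention weight.
types = [[1.0, 0.0], [0.0, 1.0]]
typed = latent_entity_type([1.0, 0.0], types)
```

In the paper's full model this mixture would augment the entity features fed into the entity-aware attention layer; here it only shows the soft-selection mechanism itself.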
Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.