An Attention-Based Word-Level Interaction Model for Knowledge Base Relation Detection
Relation detection plays a crucial role in knowledge base question answering, and it is challenging because of the high variance of relation expression in real-world questions. Traditional relation detection models based on deep learning follow an encoding-comparing paradigm, where the question and the candidate relation are represented as vectors to compare their semantic similarity. Max- or average-pooling operation, which is used to compress the sequence of words into fixed-dimensional vectors, becomes the bottleneck of information flow. In this paper, we propose an attention-based word-level interaction model (ABWIM) to alleviate the information loss issue caused by aggregating the sequence into a fixed-dimensional vector before the comparison. First, attention mechanism is adopted to learn the soft alignments between words from the question and the relation. Then, fine-grained comparisons are performed on the aligned words. Finally, the comparison results are merged with a simple recurrent layer to estimate the semantic similarity. Besides, a dynamic sample selection strategy is proposed to accelerate the training procedure without decreasing the performance. Experimental results of relation detection on both SimpleQuestions and WebQuestions datasets show that ABWIM achieves the state-of-the-art accuracy, demonstrating its effectiveness.
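The word-level attention step described in the abstract can be sketched as follows. This is an illustrative NumPy sketch, not the paper's exact architecture: the dot-product attention scoring and the element-wise comparison are simplifying assumptions standing in for the model's learned components.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over the last axis."""
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def word_level_attention(Q, R):
    """Q: (m, d) question word vectors; R: (n, d) relation word vectors.
    Each question word attends over all relation words (soft alignment),
    then a fine-grained per-word comparison is computed, instead of
    pooling each sequence into a single fixed-dimensional vector first."""
    scores = Q @ R.T           # (m, n) pairwise similarity scores
    alpha = softmax(scores)    # (m, n) attention weights per question word
    aligned = alpha @ R        # (m, d) soft-aligned relation representation
    comparison = Q * aligned   # (m, d) element-wise comparison features
    return alpha, comparison

rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))    # 4 question words, 8-dim embeddings
R = rng.normal(size=(3, 8))    # 3 relation words
alpha, comp = word_level_attention(Q, R)
print(alpha.shape, comp.shape)  # (4, 3) (4, 8)
```

In the full model, the per-word comparison features would then be merged by a recurrent layer to produce a scalar similarity score; here only the alignment and comparison stages are shown.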
Main authors: Hongzhi Zhang, Guandong Xu, Xiao Liang, Guangluan Xu, Feng Li, Kun Fu, Lei Wang, Tinglei Huang
Format: article
Language: EN
Published: IEEE, 2018
Online access: https://doaj.org/article/a2b8ec9f03314e2cbec6c7ecdee5a5fb
id: oai:doaj.org-article:a2b8ec9f03314e2cbec6c7ecdee5a5fb
Title: An Attention-Based Word-Level Interaction Model for Knowledge Base Relation Detection
Authors: Hongzhi Zhang, Guandong Xu, Xiao Liang, Guangluan Xu, Feng Li, Kun Fu, Lei Wang, Tinglei Huang
Publisher: IEEE
DOI: 10.1109/ACCESS.2018.2883304
ISSN: 2169-3536
Published: 2018
Published online: https://ieeexplore.ieee.org/document/8546730/
Source: IEEE Access, Vol 6, Pp 75429-75441 (2018)
Subjects: Relation detection; knowledge base question answering; word-level interaction; attention; dynamic sample selection; Electrical engineering. Electronics. Nuclear engineering (TK1-9971)