D‐BERT: Incorporating dependency‐based attention into BERT for relation extraction
Abstract: Relation extraction between entity pairs is an increasingly critical area in natural language processing. Recently, the pre-trained bidirectional encoder representations from transformers (BERT) model has performed excellently on text classification and sequence labelling tasks. Here, high-level syntactic features that consider the dependency between each word and the target entities are incorporated into the pre-trained language model. The model also utilizes the intermediate layers of BERT to acquire different levels of semantic information and designs multi-granularity features for the final relation classification. The model offers a significant improvement over published methods for relation extraction on widely used data sets.
Saved in: DOAJ
Main authors: Yuan Huang, Zhixing Li, Wei Deng, Guoyin Wang, Zhimin Lin
Format: article
Language: EN
Published: Wiley, 2021
Published in: CAAI Transactions on Intelligence Technology, Vol 6, Iss 4, pp 417–425 (2021)
Subjects: Computational linguistics. Natural language processing (P98-98.5); Computer software (QA76.75-76.765)
Online access: https://doaj.org/article/0249608298f5411d806f6feb296e511b
DOI: https://doi.org/10.1049/cit2.12033
ISSN: 2468-2322
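The abstract outlines two mechanisms: attention over tokens weighted by their dependency relation to the target entities, and multi-granularity features pooled from intermediate BERT layers. The sketch below illustrates both ideas in PyTorch with Hugging Face `transformers`; it is not the authors' implementation. The inverse-dependency-distance scoring, the pooled layers (4, 8, 12), and the class name `DependencyAttentionRE` are illustrative assumptions.

```python
# A minimal sketch of the two ideas described in the abstract, NOT the
# authors' exact D-BERT model: (1) attention weights over tokens derived
# from their dependency-tree distance to the target entities, and
# (2) multi-granularity features pooled from intermediate BERT layers.
# The inverse-distance scoring and the layer choice (4, 8, 12) are
# illustrative assumptions.
import torch
import torch.nn as nn
from transformers import BertModel


class DependencyAttentionRE(nn.Module):  # hypothetical class name
    def __init__(self, num_relations: int, layers=(4, 8, 12)):
        super().__init__()
        self.bert = BertModel.from_pretrained(
            "bert-base-uncased", output_hidden_states=True
        )
        self.layers = layers  # assumed intermediate layers to pool
        hidden = self.bert.config.hidden_size
        self.classifier = nn.Linear(hidden * len(layers), num_relations)

    def forward(self, input_ids, attention_mask, dep_distances):
        # dep_distances: (batch, seq_len) shortest dependency-tree distance
        # from each token to the nearest target-entity token, precomputed
        # with an external parser; smaller distance -> larger weight.
        out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        hidden_states = out.hidden_states  # embeddings + one entry per layer

        # Dependency-based attention: softmax over negated distances,
        # masking padding positions.
        scores = -dep_distances.float()
        scores = scores.masked_fill(attention_mask == 0, float("-inf"))
        weights = torch.softmax(scores, dim=-1).unsqueeze(-1)  # (B, T, 1)

        # Multi-granularity features: entity-aware pooling at each selected
        # layer, concatenated for the final relation classifier.
        pooled = [(weights * hidden_states[l]).sum(dim=1) for l in self.layers]
        return self.classifier(torch.cat(pooled, dim=-1))
```

In use, `dep_distances` would come from a dependency parse of the input sentence (for example, the shortest-path distance from each token to the nearest entity token), and the classifier would be trained with a standard cross-entropy loss over the relation labels.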