D‐BERT: Incorporating dependency‐based attention into BERT for relation extraction

Bibliographic Details
Main authors: Yuan Huang, Zhixing Li, Wei Deng, Guoyin Wang, Zhimin Lin
Format: article
Language: EN
Published: Wiley 2021
Online access: https://doaj.org/article/0249608298f5411d806f6feb296e511b
Description
Summary: Relation extraction between entity pairs is an increasingly critical area in natural language processing. Recently, the pre‐trained Bidirectional Encoder Representations from Transformers (BERT) model has performed excellently on text classification and sequence labelling tasks. Here, high‐level syntactic features that capture the dependency between each word and the target entities are incorporated into the pre‐trained language model. Our model also utilizes the intermediate layers of BERT to acquire different levels of semantic information and designs multi‐granularity features for the final relation classification. Our model offers a significant improvement over published methods for relation extraction on widely used data sets.
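
The abstract describes two mechanisms: pooling intermediate BERT layers for multi‐granularity features, and a dependency‐based attention that focuses on tokens syntactically related to the target entities. The following Python sketch (using the Hugging Face transformers library, not the authors' released code) illustrates one plausible reading of that design; the choice of layers, the construction of the dependency mask, and the classifier head are illustrative assumptions rather than details taken from the paper.

import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizerFast


class DependencyAttentionRE(nn.Module):
    """Sketch of BERT relation extraction with dependency-masked attention
    over several intermediate layers (assumed architecture, not D-BERT itself)."""

    def __init__(self, num_relations: int, layers=(-4, -3, -2, -1)):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        self.layers = layers                      # intermediate layers to pool
        hidden = self.bert.config.hidden_size
        self.attn_score = nn.Linear(hidden, 1)    # token-level attention scorer
        self.classifier = nn.Linear(hidden * len(layers), num_relations)

    def forward(self, input_ids, attention_mask, dep_mask):
        # dep_mask: 1 for tokens on the dependency path to the target entities,
        # 0 elsewhere (how this mask is produced is an assumption of the sketch).
        out = self.bert(input_ids, attention_mask=attention_mask,
                        output_hidden_states=True)
        pooled = []
        for layer in self.layers:
            h = out.hidden_states[layer]                     # (B, T, H)
            scores = self.attn_score(h).squeeze(-1)          # (B, T)
            # Ignore padding and tokens unrelated to the entities.
            scores = scores.masked_fill((attention_mask * dep_mask) == 0, -1e9)
            weights = torch.softmax(scores, dim=-1)          # (B, T)
            pooled.append(torch.einsum("bt,bth->bh", weights, h))
        # Concatenate per-layer summaries into a multi-granularity feature.
        return self.classifier(torch.cat(pooled, dim=-1))    # relation logits


if __name__ == "__main__":
    tok = BertTokenizerFast.from_pretrained("bert-base-uncased")
    enc = tok("The company was founded by the entrepreneur.", return_tensors="pt")
    dep_mask = torch.ones_like(enc["input_ids"])   # dummy mask for this demo
    model = DependencyAttentionRE(num_relations=19)
    logits = model(enc["input_ids"], enc["attention_mask"], dep_mask)
    print(logits.shape)   # (1, num_relations)

In practice the dependency mask would be derived from a parser (e.g., marking tokens on the shortest dependency path between the two entity mentions), and the pooled features would be trained end‐to‐end with a cross‐entropy loss over the relation labels.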