D‐BERT: Incorporating dependency‐based attention into BERT for relation extraction

Abstract Relation extraction between entity pairs is an increasingly critical area in natural language processing. Recently, the pre-trained bidirectional encoder representation from transformers (BERT) has performed excellently on text classification and sequence labelling tasks. Here, the high-level...
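The truncated abstract does not specify the mechanism, but a common way to incorporate dependency information into attention is to restrict each token's attention to tokens linked by a dependency arc. The following is a minimal illustrative sketch of that general idea, not the paper's actual method; the function names and the head-list input format are assumptions for illustration.

```python
import numpy as np

def dependency_mask(heads):
    """Build a symmetric boolean attention mask from dependency heads.

    heads[i] is the index of token i's syntactic head (-1 for the root).
    Each token may attend to itself, its head, and its children.
    """
    n = len(heads)
    mask = np.eye(n, dtype=bool)
    for i, h in enumerate(heads):
        if h >= 0:
            mask[i, h] = mask[h, i] = True
    return mask

def dependency_attention(Q, K, V, heads):
    """Scaled dot-product attention restricted to dependency-linked tokens."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)
    # Positions without a dependency arc get a large negative score,
    # so softmax assigns them (near-)zero weight.
    scores = np.where(dependency_mask(heads), scores, -1e9)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V
```

In practice such a mask (or a soft variant of it) would be combined with BERT's existing multi-head attention rather than replacing it outright.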

Bibliographic details
Main authors: Yuan Huang, Zhixing Li, Wei Deng, Guoyin Wang, Zhimin Lin
Format: article
Language: EN
Published: Wiley 2021
Online access: https://doaj.org/article/0249608298f5411d806f6feb296e511b