D‐BERT: Incorporating dependency‐based attention into BERT for relation extraction
Abstract: Relation extraction between entity pairs is an increasingly critical area in natural language processing. Recently, the pre-trained bidirectional encoder representations from transformers (BERT) model has performed excellently on text classification and sequence labelling tasks. Here, the high-level...
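The abstract names the core idea: folding dependency-parse structure into BERT's attention for relation extraction. The record does not describe the mechanism, but one common reading of "dependency-based attention" is masking self-attention scores so each token attends only to tokens it is linked to in the dependency parse. The sketch below illustrates that idea in plain Python; the function name, the adjacency-matrix encoding, and the masking-before-softmax choice are all assumptions for illustration, not the paper's actual method.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def dependency_masked_attention(scores, dep_adjacency):
    """Hypothetical dependency-based attention (illustrative only).

    scores: n x n raw attention scores (query token i -> key token j).
    dep_adjacency: n x n 0/1 matrix from a dependency parse,
    with 1 where tokens are linked (self-loops included).
    Positions with no dependency link are masked to -inf before
    softmax, so attention mass flows only along parse edges.
    """
    n = len(scores)
    out = []
    for i in range(n):
        row = [scores[i][j] if dep_adjacency[i][j] else float("-inf")
               for j in range(n)]
        out.append(softmax(row))
    return out
```

With uniform raw scores, attention spreads evenly over the parse-linked positions and is exactly zero elsewhere, which is the intended effect of such a mask.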
Saved in:
Main Authors: Yuan Huang, Zhixing Li, Wei Deng, Guoyin Wang, Zhimin Lin
Format: article
Language: EN
Published: Wiley, 2021
Online Access: https://doaj.org/article/0249608298f5411d806f6feb296e511b
Similar Items
- Constrained tolerance rough set in incomplete information systems
  by: Renxia Wan, et al.
  Published: (2021)
- An efficient hybrid recommendation model based on collaborative filtering recommender systems
  by: Mohammed Fadhel Aljunid, et al.
  Published: (2021)
- Low-rank constrained weighted discriminative regression for multi-view feature learning
  by: Chao Zhang, et al.
  Published: (2021)
- Deep imitation reinforcement learning for self-driving by vision
  by: Qijie Zou, et al.
  Published: (2021)
- Design and analysis of recurrent neural network models with non-linear activation functions for solving time-varying quadratic programming problems
  by: Xiaoyan Zhang, et al.
  Published: (2021)