Relation classification via BERT with piecewise convolution and focal loss.
Recent relation extraction architectures have evolved from shallow neural networks, such as convolutional and recurrent neural networks, to pretrained language models such as BERT. However, these methods did not consider the semantic information in the sequence or the distance de...
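The title mentions focal loss, a standard technique for down-weighting easy examples during classification. As a rough illustration (not the paper's own implementation), here is the binary focal-loss formula FL(p) = -α(1 − p)^γ·log(p), where p is the predicted probability of the true class; the parameter values α = 0.25 and γ = 2 are the common defaults, assumed here rather than taken from the article:

```python
import math

def focal_loss(p_true: float, gamma: float = 2.0, alpha: float = 0.25) -> float:
    """Binary focal loss for a single example.

    p_true: predicted probability assigned to the correct class.
    gamma:  focusing parameter; larger values suppress easy examples more.
    alpha:  class-balancing weight.
    """
    return -alpha * (1.0 - p_true) ** gamma * math.log(p_true)

# A confidently correct prediction (p = 0.9) contributes far less loss
# than a hard, misclassified one (p = 0.1), which is the point of the
# (1 - p)^gamma modulating factor.
easy = focal_loss(0.9)
hard = focal_loss(0.1)
```

With γ = 0 and α = 1 this reduces to ordinary cross-entropy, which is one way to sanity-check an implementation.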
Saved in:
| Main authors: | Jianyi Liu, Xi Duan, Ru Zhang, Youqiang Sun, Lei Guan, Bingjie Lin |
|---|---|
| Format: | article |
| Language: | EN |
| Published: | Public Library of Science (PLoS), 2021 |
| Subjects: | |
| Online access: | https://doaj.org/article/fd6687a9c1c049d39116f5b7d51f5b31 |
Similar items
- Sentence Compression Using BERT and Graph Convolutional Networks
  by: Yo-Han Park, et al.
  Published: (2021)
- D‐BERT: Incorporating dependency‐based attention into BERT for relation extraction
  by: Yuan Huang, et al.
  Published: (2021)
- Bert-Enhanced Text Graph Neural Network for Classification
  by: Yiping Yang, et al.
  Published: (2021)
- HOPS: A Fast Algorithm for Segmenting Piecewise Polynomials of Arbitrary Orders
  by: Junbo Duan, et al.
  Published: (2021)
- A BERT-Based Approach for Extracting Prerequisite Relations among Wikipedia Concepts
  by: Youheng Bai, et al.
  Published: (2021)