Relation classification via BERT with piecewise convolution and focal loss.

The architectures of recent relation extraction models have evolved from shallow neural networks, such as convolutional and recurrent neural networks, to pre-trained language models such as BERT. However, these methods neither fully exploit the semantic information within the sequence nor address the long-distance de...

Full description

Saved in:
Bibliographic Details
Main Authors: Jianyi Liu, Xi Duan, Ru Zhang, Youqiang Sun, Lei Guan, Bingjie Lin
Format: article
Language: EN
Published: Public Library of Science (PLoS) 2021
Subjects:
Medicine (R)
Science (Q)
Online Access: https://doaj.org/article/fd6687a9c1c049d39116f5b7d51f5b31
id oai:doaj.org-article:fd6687a9c1c049d39116f5b7d51f5b31
record_format dspace
spelling oai:doaj.org-article:fd6687a9c1c049d39116f5b7d51f5b31 2021-12-02T20:08:19Z
Relation classification via BERT with piecewise convolution and focal loss.
ISSN: 1932-6203
DOI: 10.1371/journal.pone.0257092
Published: 2021-01-01, Public Library of Science (PLoS)
Online access: https://doaj.org/article/fd6687a9c1c049d39116f5b7d51f5b31 | https://doi.org/10.1371/journal.pone.0257092 | https://doaj.org/toc/1932-6203
Source: PLoS ONE, Vol 16, Iss 9, p e0257092 (2021)
institution DOAJ
collection DOAJ
language EN
topic Medicine
R
Science
Q
description The architectures of recent relation extraction models have evolved from shallow neural networks, such as convolutional and recurrent neural networks, to pre-trained language models such as BERT. However, these methods neither fully exploit the semantic information within the sequence nor address the long-distance dependence problem, even though this internal semantic information may contain knowledge that helps relation classification. Focusing on these problems, this paper proposes a BERT-based relation classification method. Compared with existing BERT-based architectures, the proposed model captures the internal semantic information between the entity pair and better handles long-distance semantic dependence. A pre-trained BERT model is fine-tuned to extract the semantic representation of the sequence, and piecewise convolution is then applied to obtain the semantic information that influences the extraction result. Compared with existing methods, the proposed method achieves better accuracy on the relation extraction task because of the internal semantic information it extracts from the sequence. Generalization ability nevertheless remains a problem that cannot be ignored, since the number of instances differs greatly across relation categories; the focal loss function is therefore adopted to address this imbalance by assigning larger weights to under-represented or hard-to-classify categories. Finally, compared with existing methods, the proposed method reaches a superior F1 score of 89.95% on the SemEval-2010 Task 8 dataset.
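The description above outlines a pipeline of a fine-tuned BERT encoder, a piecewise convolution with piecewise max pooling around the two entity positions, and a focal-loss objective for imbalanced relation categories. The PyTorch sketch below only illustrates how such a pipeline can be wired together, assuming the Hugging Face transformers BertModel; the class names (BertPiecewiseCNN, FocalLoss), layer sizes, entity-position handling, and hyper-parameters (230 filters, gamma = 2) are assumptions made for illustration, not the authors' implementation.

```python
# Illustrative sketch only: BERT encoder -> piecewise convolution with
# piecewise max pooling around the two entities -> focal-loss classifier.
# Layer sizes, entity handling, and hyper-parameters are assumed, not the paper's.
from typing import Optional

import torch
import torch.nn as nn
import torch.nn.functional as F
from transformers import BertModel


class FocalLoss(nn.Module):
    """Multi-class focal loss: FL(p_t) = -alpha_t * (1 - p_t)^gamma * log(p_t)."""

    def __init__(self, gamma: float = 2.0, alpha: Optional[torch.Tensor] = None):
        super().__init__()
        self.gamma = gamma
        self.alpha = alpha  # optional per-class weights for rare categories

    def forward(self, logits: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
        log_probs = F.log_softmax(logits, dim=-1)              # (batch, num_classes)
        log_pt = log_probs.gather(1, target.unsqueeze(1)).squeeze(1)
        pt = log_pt.exp()
        loss = -((1.0 - pt) ** self.gamma) * log_pt            # down-weight easy examples
        if self.alpha is not None:
            loss = loss * self.alpha.to(logits.device)[target]  # up-weight rare classes
        return loss.mean()


class BertPiecewiseCNN(nn.Module):
    """Fine-tuned BERT -> 1-D convolution -> piecewise max pooling -> relation classifier."""

    def __init__(self, num_relations: int, filters: int = 230, kernel_size: int = 3):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        hidden = self.bert.config.hidden_size
        self.conv = nn.Conv1d(hidden, filters, kernel_size, padding=kernel_size // 2)
        self.classifier = nn.Linear(3 * filters, num_relations)

    def forward(self, input_ids, attention_mask, e1_pos, e2_pos):
        # Contextual token representations from the (fine-tuned) BERT encoder.
        seq = self.bert(input_ids=input_ids, attention_mask=attention_mask).last_hidden_state
        feats = torch.relu(self.conv(seq.transpose(1, 2)))     # (batch, filters, seq_len)

        pooled = []
        for b in range(feats.size(0)):
            p1, p2 = sorted((int(e1_pos[b]), int(e2_pos[b])))
            # Split the sequence into three pieces around the two entities and
            # max-pool each piece separately (piecewise max pooling).
            pieces = (feats[b, :, : p1 + 1],
                      feats[b, :, p1 + 1 : p2 + 1],
                      feats[b, :, p2 + 1 :])
            pooled.append(torch.cat([
                p.max(dim=1).values if p.size(1) > 0 else feats.new_zeros(feats.size(1))
                for p in pieces
            ]))
        return self.classifier(torch.stack(pooled))            # (batch, num_relations)
```

A training step would then combine the two parts, e.g. loss = FocalLoss(gamma=2.0)(model(input_ids, attention_mask, e1_pos, e2_pos), labels), so that the (1 - p_t)^gamma factor shifts the objective away from frequent, easy relations toward rare or hard-to-classify ones.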
format article
author Jianyi Liu
Xi Duan
Ru Zhang
Youqiang Sun
Lei Guan
Bingjie Lin
author_sort Jianyi Liu
title Relation classification via BERT with piecewise convolution and focal loss.
title_sort relation classification via bert with piecewise convolution and focal loss.
publisher Public Library of Science (PLoS)
publishDate 2021
url https://doaj.org/article/fd6687a9c1c049d39116f5b7d51f5b31
_version_ 1718375220504428544