Sentence Compression Using BERT and Graph Convolutional Networks
Sentence compression is a natural language processing task that produces a short paraphrase of an input sentence by deleting words from it while ensuring grammatical correctness and preserving the core information. This study introduces a graph convolutional network (GCN) into...
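The deletion-based formulation described in the abstract can be illustrated with a minimal sketch: each input token receives a binary keep/drop label, and the compression is the kept tokens in their original order. The labels below are hand-written for illustration only; in the paper's setting they would be predicted by the BERT+GCN model, which is not implemented here.

```python
def compress(tokens, keep_labels):
    """Return the subsequence of tokens whose label is 1 (keep).

    Deletion-based compression never reorders or rewrites tokens;
    it only drops some of them.
    """
    return [tok for tok, keep in zip(tokens, keep_labels) if keep == 1]

sentence = "The company said on Monday that it will release the product".split()
labels = [1, 1, 1, 0, 0, 0, 1, 1, 1, 1, 1]  # illustrative labels: drop "on Monday that"
print(" ".join(compress(sentence, labels)))
# → "The company said it will release the product"
```

In practice the labeler must also keep the output grammatical, which is why sequence models with syntactic signal (such as a GCN over a dependency parse) are used rather than independent per-token decisions.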
| Main Authors: | Yo-Han Park, Gyong-Ho Lee, Yong-Seok Choi, Kong-Joo Lee |
|---|---|
| Format: | article |
| Language: | EN |
| Published: | MDPI AG, 2021 |
| Online Access: | https://doaj.org/article/f662c617032640ec80e68ff4dde1c624 |
Similar Items
- MSGCN: Multi-Subgraph Based Heterogeneous Graph Convolution Network Embedding
  by: Junhui Chen, et al.
  Published: (2021)
- Spectral-Spatial Offset Graph Convolutional Networks for Hyperspectral Image Classification
  by: Minghua Zhang, et al.
  Published: (2021)
- Deep Graph Convolutional Networks for Accurate Automatic Road Network Selection
  by: Jing Zheng, et al.
  Published: (2021)
- Bert-Enhanced Text Graph Neural Network for Classification
  by: Yiping Yang, et al.
  Published: (2021)
- Joint Trajectory Prediction of Multi-Linkage Robot Based on Graph Convolutional Network
  by: Hu Wu, et al.
  Published: (2020)