Improving Text-to-Code Generation with Features of Code Graph on GPT-2
Code generation, a prominent application area of deep learning models for text, comprises two fields: code-to-code and text-to-code. A recent approach, GraphCodeBERT, uses a code graph called data flow and showed a good performance improvement. Its base model architecture i...
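As background for the abstract's mention of data flow: GraphCodeBERT's code graph connects each variable use to the definition its value comes from. The sketch below is only an illustration of that idea, not the paper's or GraphCodeBERT's implementation; the helper name `data_flow_edges` and its def-use scheme are assumptions for straight-line Python assignments.

```python
import ast

def data_flow_edges(source: str):
    """Toy def-use extractor: emit an edge (definition -> use) whenever a
    variable's value flows into a later assignment, in the spirit of the
    'data flow' code graph the abstract refers to."""
    tree = ast.parse(source)
    last_def = {}   # variable name -> statement index of its latest definition
    edges = []      # (value-comes-from, value-flows-to) pairs
    for i, stmt in enumerate(tree.body):
        if not isinstance(stmt, ast.Assign):
            continue
        # Variables read on the right-hand side feed the assigned targets.
        reads = [n.id for n in ast.walk(stmt.value)
                 if isinstance(n, ast.Name) and n.id in last_def]
        for tgt in stmt.targets:
            if isinstance(tgt, ast.Name):
                for r in reads:
                    edges.append((f"{r}@{last_def[r]}", f"{tgt.id}@{i}"))
                last_def[tgt.id] = i  # this statement now defines the target
    return edges

print(data_flow_edges("a = 1\nb = a + 2\na = b * a"))
# [('a@0', 'b@1'), ('b@1', 'a@2'), ('a@0', 'a@2')]
```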
Saved in:

| Main Authors: | Incheon Paik, Jun-Wei Wang |
|---|---|
| Format: | article |
| Language: | EN |
| Published: | MDPI AG, 2021 |
| Subjects: | |
| Online Access: | https://doaj.org/article/0dfd2a59b1ae40249d53cb816a316f1d |
Similar Items
- Investigation of Pre-Trained Bidirectional Encoder Representations from Transformers Checkpoints for Indonesian Abstractive Text Summarization
  by: Henry Lucky, et al.
  Published: (2021)
- Area-Efficient Universal Code Generator for GPS L1C and BDS B1C Signals
  by: Jiwoon Park, et al.
  Published: (2021)
- Aspect-Based Sentiment Analysis in Hindi Language by Ensembling Pre-Trained mBERT Models
  by: Abhilash Pathak, et al.
  Published: (2021)
- A New Hybrid Prime Code for OCDMA Network Multimedia Applications
  by: Morsy A. Morsy, et al.
  Published: (2021)
- A Novel Key Generation Method for Group-Based Physically Unclonable Function Designs
  by: Saeed Abdolinezhad, et al.
  Published: (2021)