Language Representation Models: An Overview

In the last few decades, text mining has been used to extract knowledge from free texts. Applying neural networks and deep learning to natural language processing (NLP) tasks has led to many accomplishments for real-world language problems over the years. The developments of the last five years have resulted in techniques that have allowed for the practical application of transfer learning in NLP. The advances in the field have been substantial, and the milestone of outperforming human baseline performance on the General Language Understanding Evaluation (GLUE) benchmark has been achieved. This paper implements a targeted literature review to outline, describe, explain, and put into context the crucial techniques that helped achieve this milestone. The research presented here is a targeted review of neural language models that present vital steps towards a general language representation model.


Saved in:
Bibliographic Details
Main Authors: Thorben Schomacker, Marina Tropmann-Frick
Format: article
Language: EN
Published: MDPI AG, 2021
Subjects: Q (Science)
Online access: https://doaj.org/article/a2b1f93d252a4263ad8616a20bf6b939
id: oai:doaj.org-article:a2b1f93d252a4263ad8616a20bf6b939
record_format: dspace
DOI: 10.3390/e23111422
ISSN: 1099-4300
Published online: 2021-10-01
Full text: https://www.mdpi.com/1099-4300/23/11/1422
Journal TOC: https://doaj.org/toc/1099-4300
Journal: Entropy, Vol 23, Iss 11, p 1422 (2021)
Topics: natural language processing; neural networks; transformer; embeddings; multi-task learning; attention-based models; Science (Q); Astrophysics (QB460-466); Physics (QC1-999)