Introducing Various Semantic Models for Amharic: Experimentation and Evaluation with Multiple Tasks and Datasets
The availability of different pre-trained semantic models has enabled the quick development of machine learning components for downstream applications. However, even if texts are abundant for low-resource languages, there are very few semantic models publicly available. Most of the publicly available pre-trained models are built as multilingual versions of semantic models that do not fit well with the needs of low-resource languages. We introduce different semantic models for Amharic, a morphologically complex Ethio-Semitic language. After we investigate the publicly available pre-trained semantic models, we fine-tune two pre-trained models and train seven new models. The models include Word2Vec embeddings, distributional thesaurus (DT), BERT-like contextual embeddings, and DT embeddings obtained via network embedding algorithms. Moreover, we employ these models for different NLP tasks and study their impact. We find that newly-trained models perform better than pre-trained multilingual models. Furthermore, models based on contextual embeddings from FLAIR and RoBERTa perform better than Word2Vec models for the NER and POS tagging tasks. DT-based network embeddings are suitable for the sentiment classification task. We publicly release all the semantic models, machine learning components, and several benchmark datasets for NER, POS tagging, and sentiment classification, as well as Amharic versions of WordSim353 and SimLex999.
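The released word embeddings are standard artifacts, so they can be queried with common NLP tooling. As a minimal sketch only (the file name below is a placeholder, not the authors' published path), an Amharic Word2Vec model stored in the usual word2vec text format could be explored with gensim like this:

```python
# Minimal sketch: querying an Amharic Word2Vec model with gensim.
# "amharic_word2vec.txt" is a placeholder file name, not the authors' released path.
from gensim.models import KeyedVectors

# Load word vectors stored in the standard word2vec text format.
vectors = KeyedVectors.load_word2vec_format("amharic_word2vec.txt", binary=False)

# Nearest neighbours of an Amharic word (here, the word for "Ethiopia").
for word, score in vectors.most_similar("ኢትዮጵያ", topn=5):
    print(f"{word}\t{score:.3f}")

# Pairwise similarity, e.g. for comparison against Amharic WordSim353-style pairs.
print(vectors.similarity("ንጉሥ", "ንግሥት"))  # king / queen
```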
Saved in: DOAJ
Main Authors: Seid Muhie Yimam, Abinew Ali Ayele, Gopalakrishnan Venkatesh, Ibrahim Gashaw, Chris Biemann
Format: article
Language: EN
Published: MDPI AG, 2021
Published in: Future Internet, Vol 13, Iss 275, p 275 (2021)
DOI: 10.3390/fi13110275
Subjects: datasets; neural networks; semantic models; Amharic NLP; low-resource language; text tagging; Information technology (T58.5-58.64)
Online Access: https://doaj.org/article/544f83bbce6749a7907372d31f485747
Full Text: https://www.mdpi.com/1999-5903/13/11/275