Visualization and categorization of ecological acoustic events based on discriminant features
Although sound classification in soundscape studies is generally performed by experts, the rapid growth of acoustic data presents a major challenge for performing such a task. At the same time, identifying more discriminating features becomes crucial when analyzing soundscapes, because natural and anthropogenic sounds are very complex, particularly in Neotropical regions, where biodiversity is very high. In this scenario, research addressing the discriminatory capability of acoustic features is of utmost importance for automating these processes. In this study we present a method to identify the most discriminant features for categorizing sound events in soundscapes. Such identification is key to the classification of sound events. Our experimental findings validate the method, showing high discriminatory capability of certain features extracted from sound data and reaching an accuracy of 89.91% for the simultaneous classification of frogs, birds, and insects. An extension of these experiments to simulate binary classification reached accuracies of 82.64%, 100.0%, and 99.40% for frogs-birds, frogs-insects, and birds-insects, respectively.
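The abstract above summarizes a feature-ranking and classification workflow. As a rough illustration of that general kind of pipeline (not the authors' actual method, data, or feature set), the sketch below ranks features by importance with a random forest and then checks classification accuracy using only the top-ranked subset; the synthetic dataset, feature count, and model choice are all placeholder assumptions.

```python
# Generic sketch: rank features by discriminative power, then classify with the top subset.
# The data, feature count, and model here are illustrative placeholders, not the paper's pipeline.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Placeholder for a table of acoustic features per sound event,
# with three classes standing in for frogs, birds, and insects.
X, y = make_classification(
    n_samples=600, n_features=40, n_informative=8,
    n_classes=3, random_state=0,
)

# Rank features by importance using a forest-based model.
forest = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
ranking = np.argsort(forest.feature_importances_)[::-1]

# Keep the top-k most discriminant features and estimate classification accuracy.
top_k = 10
X_top = X[:, ranking[:top_k]]
scores = cross_val_score(
    RandomForestClassifier(n_estimators=200, random_state=0), X_top, y, cv=5
)
print(f"Mean accuracy with top {top_k} features: {scores.mean():.3f}")
```

In practice the input would be a table of acoustic indices or spectral features extracted per sound event, labeled by sound category.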
Main authors: | Liz Maribel Huancapaza Hilasaca, Lucas Pacciullio Gaspar, Milton Cezar Ribeiro, Rosane Minghim
---|---
Format: | article
Language: | EN
Published: | Elsevier, 2021
Published in: | Ecological Indicators, Vol 126, 107316 (2021)
DOI: | 10.1016/j.ecolind.2020.107316
ISSN: | 1470-160X
Subjects: | Soundscape ecology; Discriminant features; Visualization; Classification; Feature selection; Ecology (QH540-549.5)
Online access: | https://doaj.org/article/acadc71de7004e639f929bbe409dcc16
Full text: | http://www.sciencedirect.com/science/article/pii/S1470160X20312589