A data assimilation framework that uses the Kullback-Leibler divergence.
The process of integrating observations into a numerical model of an evolving dynamical system, known as data assimilation, has become an essential tool in computational science. These methods, however, are computationally expensive as they typically involve large matrix multiplication and inversion...
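The abstract's key claims are that the KL-based update is iterative, adjoint-free, and positivity-preserving. As a rough illustration of the *kind* of iteration this implies (not the paper's actual KL-DA algorithm), here is a classical multiplicative scheme, Richardson-Lucy / ML-EM, that decreases the unnormalized KL divergence between observations `y` and a linear model prediction `H @ x`; the matrix `H`, data `y`, and initial state `x0` below are invented for the example:

```python
import numpy as np

def kl_fit(x0, H, y, n_iter=500):
    """Multiplicative (Richardson-Lucy / ML-EM style) iteration that
    decreases the unnormalized KL divergence between the data y and
    the model prediction H @ x.  Because each update is a positive
    rescaling, x stays strictly positive whenever x0 is -- no
    projection or transformation step is needed."""
    x = np.asarray(x0, dtype=float).copy()
    col_sums = H.sum(axis=0)              # normalizer: sum_i H_ij
    for _ in range(n_iter):
        x *= (H.T @ (y / (H @ x))) / col_sums
    return x

# Small consistent positive system: iterate toward the exact solution.
H = np.array([[1.0, 2.0], [3.0, 1.0], [1.0, 1.0]])
x_true = np.array([1.0, 2.0])
y = H @ x_true
x = kl_fit(np.ones(2), H, y)
```

The multiplicative form is what makes positivity automatic, which is the property the abstract highlights for the KL-DA methods as an advantage over Kalman-filter approaches.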
Saved in: DOAJ
Main Authors: | Sam Pimentel; Youssef Qranfal |
---|---|
Format: | article |
Language: | EN |
Published: | Public Library of Science (PLoS), 2021 |
Subjects: | Medicine (R); Science (Q) |
Online Access: | https://doaj.org/article/900f63dbf96d45dd8767f59a7ab06606 |
id | oai:doaj.org-article:900f63dbf96d45dd8767f59a7ab06606
---|---|
record_format | dspace
issn | 1932-6203
doi | 10.1371/journal.pone.0256584
citation | PLoS ONE, Vol 16, Iss 8, p e0256584 (2021)
institution | DOAJ
collection | DOAJ
language | EN
topic | Medicine (R); Science (Q)
description | The process of integrating observations into a numerical model of an evolving dynamical system, known as data assimilation, has become an essential tool in computational science. These methods, however, are computationally expensive as they typically involve large matrix multiplication and inversion. Furthermore, it is challenging to incorporate a constraint into the procedure, such as requiring a positive state vector. Here we introduce an entirely new approach to data assimilation, one that satisfies an information measure and uses the unnormalized Kullback-Leibler divergence, rather than the standard choice of Euclidean distance. Two sequential data assimilation algorithms are presented within this framework and are demonstrated numerically. These new methods are solved iteratively and do not require an adjoint. We find them to be computationally more efficient than Optimal Interpolation (3D-Var solution) and the Kalman filter whilst maintaining similar accuracy. Furthermore, these Kullback-Leibler data assimilation (KL-DA) methods naturally embed constraints, unlike Kalman filter approaches. They are ideally suited to systems that require positive valued solutions as the KL-DA guarantees this without need of transformations, projections, or any additional steps. This Kullback-Leibler framework presents an interesting new direction of development in data assimilation theory. The new techniques introduced here could be developed further and may hold potential for applications in the many disciplines that utilize data assimilation, especially where there is a need to evolve variables of large-scale systems that must obey physical constraints.
format | article
author | Sam Pimentel; Youssef Qranfal
title | A data assimilation framework that uses the Kullback-Leibler divergence.
publisher | Public Library of Science (PLoS)
publishDate | 2021
url | https://doaj.org/article/900f63dbf96d45dd8767f59a7ab06606