Multi-Focus Image Fusion Based on Pixel Significance Using Counterlet Transform
The objective of image fusion is to merge multiple source images so that the final representation contains a greater amount of useful information than any single input. In this paper, a weighted average fusion method is proposed. It depends on weights that are extracted from the source images using the counterlet transform. The extraction is done by setting the approximation transform coefficients to zero, then taking the inverse counterlet transform to obtain the details of the images to be fused. The performance of the proposed algorithm has been verified on several greyscale and colour test images and compared with some existing methods.
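The abstract describes the weight-extraction step as: transform each source image, zero the approximation (low-pass) coefficients, inverse-transform so only detail content remains, and let the detail magnitude drive a per-pixel weighted average. The sketch below illustrates that idea only; it is not the paper's implementation. A single-level 2-D Haar wavelet stands in for the contourlet transform, and all function names (`haar2d`, `detail_content`, `fuse`) are invented for this example.

```python
import numpy as np

def haar2d(img):
    """Single-level 2-D Haar decomposition: approximation + 3 detail bands."""
    p00, p10 = img[0::2, 0::2], img[1::2, 0::2]
    p01, p11 = img[0::2, 1::2], img[1::2, 1::2]
    a = (p00 + p10 + p01 + p11) / 4  # approximation (low-pass)
    h = (p00 - p10 + p01 - p11) / 4  # horizontal detail
    v = (p00 + p10 - p01 - p11) / 4  # vertical detail
    d = (p00 - p10 - p01 + p11) / 4  # diagonal detail
    return a, h, v, d

def ihaar2d(a, h, v, d):
    """Exact inverse of haar2d."""
    out = np.empty((2 * a.shape[0], 2 * a.shape[1]))
    out[0::2, 0::2] = a + h + v + d
    out[1::2, 0::2] = a - h + v - d
    out[0::2, 1::2] = a + h - v - d
    out[1::2, 1::2] = a - h - v + d
    return out

def detail_content(img):
    """Zero the approximation band and inverse-transform: only detail remains."""
    a, h, v, d = haar2d(img)
    return ihaar2d(np.zeros_like(a), h, v, d)

def fuse(img1, img2, eps=1e-12):
    """Weighted-average fusion; weights follow per-pixel detail magnitude."""
    w1 = np.abs(detail_content(img1)) + eps  # eps avoids divide-by-zero
    w2 = np.abs(detail_content(img2)) + eps
    return (w1 * img1 + w2 * img2) / (w1 + w2)
```

Because the result is a pointwise convex combination of the two inputs, every fused pixel lies between the corresponding input pixels; pixels where one image carries more detail (i.e. is better focused locally) lean toward that image.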
Saved in:
Main Author: Iman M.G. Alwan
Format: article
Language: EN
Published: Al-Khwarizmi College of Engineering – University of Baghdad, 2015
Subjects: Chemical engineering; Engineering (General). Civil engineering (General)
Online Access: https://doaj.org/article/089e2970b86f44d698e2c4e5695b9ae9
Record details:
ID: oai:doaj.org-article:089e2970b86f44d698e2c4e5695b9ae9
ISSN: 1818-1171
Publication date: 2015-10-01
Full text: http://www.iasj.net/iasj?func=fulltext&aId=104627
Source: Al-Khawarizmi Engineering Journal, Vol 11, Iss 3, Pp 85-95 (2015)
Subjects: Chemical engineering (TP155-156); Engineering (General). Civil engineering (General) (TA1-2040)