FOCUS FUSION NETWORK FOR VISIBLE AND INFRARED IMAGE FUSION

Yihan Zhang, Yichu Fang, Qian Zhang

Research output: Chapter in Book/Conference proceeding › Conference contribution › peer-review

Abstract

Image fusion techniques are commonly used to combine visible and infrared channels. The composite image should retain as much texture information from the visible channel and thermal information from the infrared channel as possible, yet balancing these two features is a challenge in practical applications. In this paper we propose a method for efficient and robust dual-channel image fusion using self-attention and mutual cross-attention, together with a novel heatmap-based focusing loss that optimizes the training process. Experimental results show that our approach significantly improves the detail of fused images and demonstrate the generalizability of our method across different scenes.
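The abstract does not detail the fusion mechanism, but the core idea of mutual cross-attention, where each channel's features attend to the other channel's features before merging, can be sketched with standard scaled dot-product attention. The function and variable names below, and the simple averaging merge, are illustrative assumptions, not the paper's actual architecture:

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(query_feats, context_feats, d_k):
    # query_feats: (N, d) tokens from one channel (e.g. visible)
    # context_feats: (M, d) tokens from the other channel (e.g. infrared)
    # standard scaled dot-product attention of queries over the context
    scores = query_feats @ context_feats.T / np.sqrt(d_k)
    weights = softmax(scores, axis=-1)
    return weights @ context_feats

rng = np.random.default_rng(0)
vis = rng.standard_normal((16, 32))  # visible-channel feature tokens (hypothetical)
ir = rng.standard_normal((16, 32))   # infrared-channel feature tokens (hypothetical)

# mutual cross-attention: each channel attends to the other,
# then the two attended feature maps are merged (here: a simple average)
vis_attends_ir = cross_attention(vis, ir, d_k=32)
ir_attends_vis = cross_attention(ir, vis, d_k=32)
fused = 0.5 * (vis_attends_ir + ir_attends_vis)
print(fused.shape)  # (16, 32)
```

In a full ViT-style network each attention block would additionally use learned query/key/value projections and multiple heads; this sketch omits them to show only the cross-channel attention pattern itself.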

Original language: English
Title of host publication: 2024 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2024 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 3850-3854
Number of pages: 5
ISBN (Electronic): 9798350344851
DOIs
Publication status: Published - 2024
Event: 49th IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2024 - Seoul, Korea, Republic of
Duration: 14 Apr 2024 - 19 Apr 2024

Publication series

Name: ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
ISSN (Print): 1520-6149

Conference

Conference: 49th IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2024
Country/Territory: Korea, Republic of
City: Seoul
Period: 14/04/24 - 19/04/24

Keywords

  • Cross-attention
  • Image fusion
  • Infrared image
  • Self-attention
  • ViT

ASJC Scopus subject areas

  • Software
  • Signal Processing
  • Electrical and Electronic Engineering
