A two-branch multi-scale residual attention network for single image super-resolution in remote sensing imagery
Abstract
High-resolution remote sensing imagery finds applications in diverse fields such as land-use mapping, crop planning, and disaster surveillance. Reconstructing edges, textures, and other fine details is crucial to providing detailed and precise insights. Despite recent advances in detail enhancement through deep learning, disparities between original and reconstructed images persist. To address this challenge, we propose a two-branch multi-scale residual attention network for single-image super-resolution reconstruction. The network extracts complementary information from the input image through two branches whose convolution layers use different kernel sizes, capturing both low-level and high-level features. Multi-scale efficient channel attention and spatial attention blocks model channel and spatial dependencies in the feature maps, yielding more discriminative features and more accurate predictions. In addition, residual modules with skip connections help to mitigate the vanishing-gradient problem. We trained the proposed model on the WHU-RS19 dataset, collated from Google Earth satellite imagery, and validated it on the UC Merced, RSSCN7, AID, and real-world satellite datasets. The experimental results show that our network exploits features at different levels of detail more effectively than state-of-the-art models.
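To make the building blocks described above concrete, the following is a minimal PyTorch sketch of a two-branch residual attention block: two parallel convolution branches with different kernel sizes, a fusion layer, channel and spatial attention, and a skip connection. It is an illustration under stated assumptions, not the authors' released code; the class names, the 3×3/5×5 kernel choice, and the squeeze-and-excitation style channel attention are hypothetical stand-ins for the multi-scale efficient channel attention described in the abstract.

```python
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    """Illustrative channel attention (squeeze-and-excitation style stand-in)."""
    def __init__(self, channels, reduction=8):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        # Reweight channels by their global importance
        return x * self.fc(self.pool(x))

class SpatialAttention(nn.Module):
    """Illustrative spatial attention over pooled channel statistics."""
    def __init__(self, kernel_size=7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x):
        avg = x.mean(dim=1, keepdim=True)          # average over channels
        mx, _ = x.max(dim=1, keepdim=True)         # max over channels
        return x * torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))

class TwoBranchResidualAttentionBlock(nn.Module):
    """Two convolution branches with different kernel sizes, fused and passed
    through channel + spatial attention, with a residual skip connection."""
    def __init__(self, channels):
        super().__init__()
        self.branch3 = nn.Conv2d(channels, channels, 3, padding=1)  # fine detail
        self.branch5 = nn.Conv2d(channels, channels, 5, padding=2)  # wider context
        self.fuse = nn.Conv2d(2 * channels, channels, 1)
        self.ca = ChannelAttention(channels)
        self.sa = SpatialAttention()
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        b3 = self.act(self.branch3(x))
        b5 = self.act(self.branch5(x))
        fused = self.fuse(torch.cat([b3, b5], dim=1))
        out = self.sa(self.ca(fused))
        return x + out  # skip connection helps mitigate vanishing gradients

# Usage: apply one block to a dummy 64-channel feature map
feat = torch.randn(1, 64, 48, 48)
block = TwoBranchResidualAttentionBlock(64)
print(block(feat).shape)  # torch.Size([1, 64, 48, 48])
```

In the full network, stacks of such blocks would feed an upsampling stage (e.g., sub-pixel convolution) to produce the super-resolved image; the exact depth, fusion strategy, and attention variant in the paper may differ from this sketch.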