TY - GEN
T1 - Single Image Super-Resolution Using Inverted Residual and Channel-Wise Attention
AU - Hosen, Md Imran
AU - Islam, Md Baharul
N1 - Publisher Copyright:
© 2022 IEEE.
PY - 2022
Y1 - 2022
N2 - Single-image super-resolution (SISR) is the task of reconstructing a high-resolution image from a low-resolution image. Convolutional neural network (CNN)-based SISR techniques have demonstrated promising results. However, most CNN-based models cannot discriminate between different forms of information and treat them identically, which limits the models' ability to represent information. Moreover, as a neural network's depth increases, long-term information from earlier layers is more likely to degrade in later layers, which leads to poor image SR performance. This research presents a single-image super-resolution strategy employing an inverted residual connection with channel-wise attention (IRCA) to preserve meaningful information and retain long-term features while balancing performance and computational cost. The inverted residual block achieves long-term information persistence with fewer parameters than traditional residual networks. Meanwhile, by explicitly modeling inter-dependencies between channels, the attention block progressively adjusts channel-wise feature responses, enhancing essential information and suppressing unnecessary information. The efficacy of the proposed approach is demonstrated on three publicly accessible datasets. Code is available at https://github.com/mdhosen/SISR_IResBlock
AB - Single-image super-resolution (SISR) is the task of reconstructing a high-resolution image from a low-resolution image. Convolutional neural network (CNN)-based SISR techniques have demonstrated promising results. However, most CNN-based models cannot discriminate between different forms of information and treat them identically, which limits the models' ability to represent information. Moreover, as a neural network's depth increases, long-term information from earlier layers is more likely to degrade in later layers, which leads to poor image SR performance. This research presents a single-image super-resolution strategy employing an inverted residual connection with channel-wise attention (IRCA) to preserve meaningful information and retain long-term features while balancing performance and computational cost. The inverted residual block achieves long-term information persistence with fewer parameters than traditional residual networks. Meanwhile, by explicitly modeling inter-dependencies between channels, the attention block progressively adjusts channel-wise feature responses, enhancing essential information and suppressing unnecessary information. The efficacy of the proposed approach is demonstrated on three publicly accessible datasets. Code is available at https://github.com/mdhosen/SISR_IResBlock
KW - Channel-wise Attention Block
KW - Convolutional Neural Network (CNN)
KW - Image Super Resolution
KW - Inverted Residual Network
UR - http://www.scopus.com/inward/record.url?scp=85152265949&partnerID=8YFLogxK
U2 - 10.1109/ISPACS57703.2022.10082788
DO - 10.1109/ISPACS57703.2022.10082788
M3 - Conference contribution
AN - SCOPUS:85152265949
T3 - 2022 International Symposium on Intelligent Signal Processing and Communication Systems, ISPACS 2022
BT - 2022 International Symposium on Intelligent Signal Processing and Communication Systems, ISPACS 2022
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2022 International Symposium on Intelligent Signal Processing and Communication Systems, ISPACS 2022
Y2 - 22 November 2022 through 25 November 2022
ER -