Towards Single Sand-Dust Image Restoration via Vision Transformer with Multi-Scale Feature Aggregation

Published in IEEE 21st India Council International Conference (INDICON), 2024

Authors: Romala Mishra, Sobhan Kanti Dhara, Anusha Vupputuri, Sukhadev Meher

Sand-dust images contain suspended dust particles that degrade image quality in terms of visibility and illumination, leading to hazy appearance, distorted depth perception, color shifts, and wavelength-dependent scattering that tints the image, all of which impact various vision-based tasks. Many existing sand-dust restoration techniques struggle to preserve fine details during restoration and have difficulty with varying sand-dust conditions. To address these challenges, we propose a vision transformer-based framework that employs shifted window attention and depth-wise convolution for efficient local attention computation. It incorporates a revised layer normalization for enhanced stability and is integrated with a multi-scale feature aggregation module, which allows the framework to handle varying levels of haze by balancing the extraction of global contextual information with the preservation of local details while reducing computational overhead. Overall, our framework not only restores images under varying sand-dust conditions more effectively than other state-of-the-art frameworks but also corrects color shifts and tints, deblurs the image, mitigates distorted depth perception, and recovers the fine details of the affected image. [Website]
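To make the shifted window attention idea concrete, the sketch below shows the core bookkeeping behind it: partitioning a feature map into non-overlapping windows, and cyclically shifting the map so that windows in the shifted pass straddle the previous window boundaries. This is a minimal illustration of the general mechanism (as popularized by Swin-style transformers), not the authors' implementation; the `window_size` and `shift` parameters and the use of plain token-id grids are assumptions for clarity.

```python
# Illustrative sketch (not the authors' code): cyclic shift and window
# partitioning, the bookkeeping that underlies shifted window attention.
# Feature maps are plain 2D lists of token ids; window_size and shift
# are hypothetical parameters chosen for the example.

def cyclic_shift(fmap, shift):
    """Roll the 2D map up and left by `shift`, so that the next round of
    windows mixes tokens that sat on opposite sides of a window border."""
    h, w = len(fmap), len(fmap[0])
    return [[fmap[(r + shift) % h][(c + shift) % w] for c in range(w)]
            for r in range(h)]

def window_partition(fmap, window_size):
    """Split the map into non-overlapping window_size x window_size tiles;
    attention is then computed independently inside each tile."""
    h, w = len(fmap), len(fmap[0])
    windows = []
    for r0 in range(0, h, window_size):
        for c0 in range(0, w, window_size):
            windows.append([row[c0:c0 + window_size]
                            for row in fmap[r0:r0 + window_size]])
    return windows

# A 4x4 map of token ids, partitioned into 2x2 windows.
fmap = [[r * 4 + c for c in range(4)] for r in range(4)]
plain = window_partition(fmap, 2)
shifted = window_partition(cyclic_shift(fmap, 1), 2)
print(plain[0])    # -> [[0, 1], [4, 5]]   tokens grouped in the unshifted pass
print(shifted[0])  # -> [[5, 6], [9, 10]]  a cross-boundary grouping after the shift
```

Alternating unshifted and shifted passes lets information propagate across window borders while each attention computation stays local and cheap, which is the efficiency argument the abstract appeals to.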
