Please use this identifier to cite or link to this item: http://hdl.handle.net/20.500.11861/10701
Title: Improving autonomous vehicle cognitive robustness in extreme weather with deep learning and thermal camera fusion
Authors: Nawaz, Mehmood 
Khan, Sheheryar 
Daud, Muhammad 
Asim, Muhammad 
Anwar, Ghazanfar Ali 
Shahid, Ali Raza 
Issue Date: 2025
Source: IEEE Open Journal of Vehicular Technology, 2025, vol. 6, pp. 426-441.
Journal: IEEE Open Journal of Vehicular Technology 
Abstract: In autonomous vehicles (AVs), sensor fusion has proven effective at merging data from multiple sensors and enhancing perception capabilities. In the context of sensor fusion, the distinct strengths of different sensors, such as LiDAR, RGB, and thermal cameras, can be leveraged to mitigate the challenges imposed by extreme weather conditions. In this paper, we address multi-sensor fusion in AVs and present a comprehensive integration of a thermal sensor aimed at enhancing the cognitive robustness of AVs. Thermal sensors can detect objects and hazards that may be imperceptible to traditional visible-light sensors; when integrated with RGB and LiDAR sensors, they become highly beneficial for detecting and locating objects in adverse weather conditions. The proposed deep learning-assisted multi-sensor fusion technique consists of two parts: (1) visual information fusion and (2) object detection using LiDAR, RGB, and thermal sensors. The visual fusion framework employs a convolutional neural network (CNN) inspired by a domain image fusion algorithm. The object detection framework uses a modified version of the YOLOv8 model, which exhibits high accuracy in real-time detection. In the YOLOv8 model, we adjusted the network architecture to incorporate additional convolutional layers and altered the loss function to enhance detection accuracy in foggy and rainy conditions. The proposed technique is effective and adaptable in challenging conditions such as night or dark mode, smoke, and heavy rain. Experimental results demonstrate enhanced efficiency and cognitive robustness compared to state-of-the-art fusion and detection techniques, as evidenced by tests conducted on two public datasets (FLIR and TarDAL) and one private dataset (CUHK).
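The paper's CNN-based fusion network is not reproduced here, but the core idea of combining complementary modalities, where a thermal channel reveals objects invisible in a dark RGB frame, can be illustrated with a minimal weighted pixel-level blend. This is a hypothetical NumPy sketch for intuition only; the function name and the `alpha` weight are illustrative assumptions, not part of the authors' method:

```python
import numpy as np

def fuse_rgb_thermal(rgb, thermal, alpha=0.6):
    """Weighted pixel-level blend of an RGB frame (H, W, 3) and a
    single-channel thermal frame (H, W), both normalized to [0, 1]."""
    # Broadcast the thermal map across the three RGB channels.
    thermal_3ch = np.repeat(thermal[..., np.newaxis], 3, axis=-1)
    fused = alpha * rgb + (1.0 - alpha) * thermal_3ch
    return np.clip(fused, 0.0, 1.0)

# Toy example: a completely dark RGB scene with a warm object that
# is visible only in the thermal channel.
rgb = np.zeros((4, 4, 3))
thermal = np.zeros((4, 4))
thermal[1:3, 1:3] = 1.0  # hot region

fused = fuse_rgb_thermal(rgb, thermal, alpha=0.6)
# The hot region now appears in the fused image even though the
# RGB input carried no signal there.
```

A learned fusion network effectively replaces the fixed `alpha` with spatially varying, data-driven weights, which is what allows it to adapt to fog, rain, and low light.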
Type: Peer Reviewed Journal Article
URI: http://hdl.handle.net/20.500.11861/10701
ISSN: 2644-1330
DOI: 10.1109/OJVT.2025.3529495
Appears in Collections:Applied Data Science - Publication

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.