Browsing by Research Output - Subject "Accuracy"

Now showing 1 - 1 of 1
    Publication
    Improving autonomous vehicle cognitive robustness in extreme weather with deep learning and thermal camera fusion
    (2025)
    Nawaz, Mehmood; Khan, Sheheryar; Daud, Muhammad; Asim, Muhammad; Anwar, Ghazanfar Ali; Shahid, Ali Raza
    In autonomous vehicles (AV), sensor fusion methods have proven effective in merging data from multiple sensors and enhancing their perception capabilities. In the context of sensor fusion, the distinct strengths of multiple sensors, such as LiDAR, RGB, and thermal sensors, can be leveraged to mitigate the impact of challenges imposed by extreme weather conditions. In this paper, we address multi-sensor fusion in AVs and present a comprehensive integration of a thermal sensor aimed at enhancing the cognitive robustness of AVs. Thermal sensors possess an impressive capability to detect objects and hazards that may be imperceptible to traditional visible light sensors. When integrated with RGB and LiDAR sensors, the thermal sensor becomes highly beneficial for detecting and locating objects in adverse weather conditions. The proposed deep learning-assisted multi-sensor fusion technique consists of two parts: (1) visual information fusion and (2) object detection using LiDAR, RGB, and thermal sensors. The visual fusion framework employs a CNN (convolutional neural network) inspired by a domain image fusion algorithm. The object detection framework uses a modified version of the YOLOv8 model, which exhibits high accuracy in real-time detection. In the YOLOv8 model, we adjusted the network architecture to incorporate additional convolutional layers and altered the loss function to enhance detection accuracy in foggy and rainy conditions. The proposed technique is effective and adaptable in challenging conditions, such as night or low-light scenes, smoke, and heavy rain. The experimental results of the proposed method demonstrate enhanced efficiency and cognitive robustness compared to state-of-the-art fusion and detection techniques. This is evident from tests conducted on two public datasets (FLIR and TarDAL) and one private dataset (CUHK).
    Type: Peer Reviewed Journal Article
    DOI: 10.1109/OJVT.2025.3529495
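
The abstract above describes a two-part pipeline: CNN-based visual fusion of RGB and thermal imagery, followed by object detection with a modified YOLOv8. As a rough illustration of the first part only, the sketch below shows a minimal two-stream fusion module in PyTorch; the layer sizes, the concatenation-plus-1x1-convolution fusion, and the class name RGBThermalFusion are assumptions made for clarity and do not reflect the authors' actual architecture.

```python
# Illustrative sketch only: a minimal two-stream RGB + thermal feature fusion
# module in PyTorch. Layer sizes and the concatenation-based fusion are
# assumptions, not the architecture described in the paper.
import torch
import torch.nn as nn


class RGBThermalFusion(nn.Module):
    """Fuse an RGB image and a single-channel thermal image into one feature map."""

    def __init__(self, out_channels: int = 64):
        super().__init__()
        # Separate shallow encoders for each modality.
        self.rgb_encoder = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
        )
        self.thermal_encoder = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
        )
        # Channel-wise concatenation followed by a 1x1 convolution merges the
        # two modalities into a single feature map a detector could consume.
        self.fuse = nn.Conv2d(128, out_channels, kernel_size=1)

    def forward(self, rgb: torch.Tensor, thermal: torch.Tensor) -> torch.Tensor:
        f_rgb = self.rgb_encoder(rgb)               # (B, 64, H, W)
        f_thermal = self.thermal_encoder(thermal)   # (B, 64, H, W)
        fused = torch.cat([f_rgb, f_thermal], dim=1)  # (B, 128, H, W)
        return self.fuse(fused)                     # (B, out_channels, H, W)


if __name__ == "__main__":
    model = RGBThermalFusion()
    rgb = torch.randn(1, 3, 256, 256)      # dummy RGB frame
    thermal = torch.randn(1, 1, 256, 256)  # dummy thermal frame
    print(model(rgb, thermal).shape)       # torch.Size([1, 64, 256, 256])
```

In a full pipeline, the fused feature map (or a fused image reconstructed from it) would then be passed to the detector; the paper's YOLOv8 modifications (additional convolutional layers and an altered loss function) are not reproduced here.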