VisioDECT: a novel approach to drone detection using CBAM-integrated YOLO and GELAN-E models.
Unmanned aerial vehicles (UAVs) have revolutionized logistics, environmental monitoring, and aerial surveillance, but their widespread use has raised security concerns, notably illegal spying, smuggling, and the movement of hazardous substances. Effective drone detection and payload assessment are therefore essential for maintaining public safety and protecting sensitive locations. This article proposes a vision-based system for real-time drone detection and classification built on the YOLOv5, YOLOv8, and GELAN-E deep learning models, enhanced with attention mechanisms and interpretability techniques. Integrating the Convolutional Block Attention Module (CBAM) into the YOLO architecture, yielding a variant we call AttnYOLO, sharpens feature extraction and focuses the network on the most relevant image regions; the added spatial and channel attention significantly boosts detection performance, particularly for small and occluded drones. Additionally, we employ Eigen Class Activation Mapping (EigenCAM) to visualize where the model attends during detection, increasing the system's transparency and interpretability.

The VisioDECT dataset comprises 20,924 annotated images of six drone models captured under cloudy, sunny, and evening conditions. Under cloudy conditions, DenseNet201 achieved 100% classification accuracy, while DarkNet53 and InceptionV3 reached 99.99% and 99.9%, respectively. In evening scenarios, InceptionV3 reached 100% accuracy, followed by DarkNet53 at 99.98%. Our proposed model, GELAN-E, excelled at both detection and classification: in cloudy settings it outperformed YOLOv8 with an accuracy of 0.988, a recall of 0.994, and a mAP50-95 score of 0.688, and in evening conditions it achieved a higher mAP50-95 score of 0.642 than YOLOv8.
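As a rough illustration of how CBAM's two stages compose (channel attention from average- and max-pooled channel descriptors passed through a shared MLP, then spatial attention from pooled channel maps passed through a small convolution), here is a minimal NumPy sketch. The weights are random placeholders, not trained parameters, and this is not the authors' AttnYOLO implementation, only the attention mechanism in isolation:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def channel_attention(x, w1, w2):
    """x: (C, H, W). Shared two-layer MLP over avg- and max-pooled descriptors."""
    avg = x.mean(axis=(1, 2))                              # (C,)
    mx = x.max(axis=(1, 2))                                # (C,)
    att = sigmoid(w2 @ np.maximum(w1 @ avg, 0.0)
                  + w2 @ np.maximum(w1 @ mx, 0.0))         # (C,) gates in (0, 1)
    return x * att[:, None, None]

def spatial_attention(x, kernel):
    """Stack channel-wise avg/max maps, apply a k*k conv, gate with sigmoid."""
    stacked = np.stack([x.mean(axis=0), x.max(axis=0)])    # (2, H, W)
    k = kernel.shape[-1]
    pad = k // 2
    padded = np.pad(stacked, ((0, 0), (pad, pad), (pad, pad)))
    H, W = x.shape[1], x.shape[2]
    out = np.zeros((H, W))
    for i in range(H):                                     # naive "same" convolution
        for j in range(W):
            out[i, j] = np.sum(padded[:, i:i + k, j:j + k] * kernel)
    return x * sigmoid(out)[None, :, :]

def cbam(x, w1, w2, kernel):
    """Channel attention first, then spatial attention, as in CBAM."""
    return spatial_attention(channel_attention(x, w1, w2), kernel)
```

Because both gates lie in (0, 1), CBAM can only rescale the feature map, never amplify it; in the detector this lets the network suppress background responses around small or occluded drones.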
These results demonstrate that attention mechanisms, combined with visual interpretability, improve drone detection performance, particularly in low-light and otherwise challenging environments, making the system well suited for real-time drone detection in civilian and military applications.
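The interpretability component rests on EigenCAM's core idea: rather than weighting activations by gradients, project a layer's feature maps onto their first principal component to obtain a class-agnostic saliency map. A minimal NumPy sketch of that idea (not the article's implementation, and with the sign convention and normalization chosen here for illustration):

```python
import numpy as np

def eigencam(activations):
    """activations: (C, H, W) feature maps from a conv layer.

    Returns an (H, W) saliency map in [0, 1]: the spatial pattern carried by
    the first principal component of the flattened activations.
    """
    C, H, W = activations.shape
    A = activations.reshape(C, H * W)
    # First right-singular vector = dominant spatial pattern across channels.
    _, _, vt = np.linalg.svd(A, full_matrices=False)
    cam = vt[0].reshape(H, W)
    # SVD sign is arbitrary; flip so the map is predominantly positive.
    if cam.sum() < 0:
        cam = -cam
    cam = np.maximum(cam, 0.0)          # keep only positive evidence
    if cam.max() > 0:
        cam = cam / cam.max()           # normalize to [0, 1] for overlay
    return cam
```

In practice the resulting map is upsampled to the input resolution and overlaid on the image, showing which regions drove the detection.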
Published in Neural Computing & Applications (Springer Nature).