
Real-Time Drone Detection and Tracking With Visible, Thermal and Acoustic Sensors (2007.07396v2)

Published 14 Jul 2020 in cs.CV and eess.SP

Abstract: This paper explores the process of designing an automatic multi-sensor drone detection system. Besides the common video and audio sensors, the system also includes a thermal infrared camera, which is shown to be a feasible solution to the drone detection task. Even with slightly lower resolution, the performance is just as good as a camera in visible range. The detector performance as a function of the sensor-to-target distance is also investigated. In addition, using sensor fusion, the system is made more robust than the individual sensors, helping to reduce false detections. To counteract the lack of public datasets, a novel video dataset containing 650 annotated infrared and visible videos of drones, birds, airplanes and helicopters is also presented (https://github.com/DroneDetectionThesis/Drone-detection-dataset). The database is complemented with an audio dataset of the classes drones, helicopters and background noise.


Summary

Real-Time Drone Detection and Tracking: A Multi-Sensor Approach

The paper "Real-Time Drone Detection and Tracking With Visible, Thermal and Acoustic Sensors" presents a comprehensive examination of an integrated multi-sensor system for automatic drone detection and tracking, highlighting the significance of combining visible, thermal infrared, and acoustic sensors. This paper aims to improve drone detection accuracy by overcoming the limitations of individual sensors through sensor fusion, and it contributes to the field by introducing a novel annotated dataset for this purpose.

The paper details the components and functions of the proposed detection system, which integrates several sensing modalities in a portable setup. The hardware consists of a thermal infrared camera (a FLIR Breach PTQ-136), a video camera operating in the visible spectrum (a Sony HDR-CX405), and an acoustic sensor, all mounted on a pan/tilt platform and managed by a laptop that serves as the computational unit. The infrared camera complements visible-spectrum detection, allowing robust operation under varying lighting conditions. An ADS-B receiver additionally covers cooperative aircraft by decoding their broadcast position and identification.
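
As an illustration of the pointing loop such a pan/tilt rig implies, the sketch below converts a detection's pixel offset from the image centre into incremental pan and tilt angles. The field-of-view and frame-size values, and all names, are illustrative assumptions rather than details taken from the paper.

HFOV_DEG, VFOV_DEG = 24.0, 19.0        # assumed horizontal/vertical field of view of the camera
FRAME_W, FRAME_H = 640, 480            # assumed frame size in pixels

def pointing_update(bbox):
    """bbox = (x, y, w, h) in pixels; returns (delta_pan_deg, delta_tilt_deg)."""
    cx = bbox[0] + bbox[2] / 2.0
    cy = bbox[1] + bbox[3] / 2.0
    delta_pan = (cx - FRAME_W / 2.0) / FRAME_W * HFOV_DEG
    delta_tilt = -(cy - FRAME_H / 2.0) / FRAME_H * VFOV_DEG   # image y grows downwards
    return delta_pan, delta_tilt

# Example: a target detected slightly right of and above the image centre
print(pointing_update((400, 180, 40, 30)))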

The approach is methodical, detailing how the sensing units are coordinated through an integrated software suite. The video pipeline uses Gaussian Mixture Models for motion detection and YOLOv2 for object recognition in both the visible and infrared feeds. Acoustic detection extracts Mel Frequency Cepstral Coefficients (MFCCs) and classifies drone audio signatures with a Long Short-Term Memory (LSTM) network. Fusing the sensor outputs makes the system more robust than any individual sensor, reducing false detections while preserving real-time operation.
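
A minimal sketch of the two pipeline stages described above follows, using OpenCV's GMM-based background subtractor for motion detection and librosa plus a small Keras LSTM for the MFCC-based audio classifier. File names, thresholds and layer sizes are assumptions for illustration; the trained models and exact parameters are not those of the paper.

import cv2
import numpy as np
import librosa
import tensorflow as tf

# --- Video: GMM background subtraction proposes moving regions ---------------
subtractor = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16,
                                                detectShadows=False)
cap = cv2.VideoCapture("ir_clip.mp4")          # hypothetical input clip
while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    # regions above a small area threshold would be cropped and passed to a
    # YOLOv2-style recognizer for class labels (drone, bird, airplane, helicopter)
    candidates = [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) > 25]
cap.release()

# --- Audio: MFCC features classified by a small LSTM -------------------------
y, sr = librosa.load("clip.wav", sr=16000, duration=1.0)
mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13).T          # (time_steps, 13)

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(mfcc.shape[0], 13)),
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dense(3, activation="softmax"),           # drone / helicopter / background
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
probs = model.predict(mfcc[np.newaxis, ...])                  # untrained here: shapes only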

As a pivotal part of the research, the authors supply the academic community with a valuable dataset captured across three airports in Sweden, addressing the deficiency of publicly available data necessary for benchmarking drone detection techniques. The dataset encompasses 650 videos along with audio samples, capturing multiple drone types and other aerial objects under diverse environmental conditions. The label set accounts for common confusion targets, such as birds and helicopters, enhancing the robustness of the applied detection and classification algorithms.
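
A minimal loop over the video portion of such a dataset might look like the sketch below; the directory layout and file extensions are assumptions, not the repository's documented structure.

from pathlib import Path
import cv2

DATASET_ROOT = Path("Drone-detection-dataset/videos")   # assumed local checkout path
for video_path in sorted(DATASET_ROOT.rglob("*.mp4")):
    cap = cv2.VideoCapture(str(video_path))
    n_frames = int(cap.get(cv2.CAP_PROP_FRAME_COUNT))
    print(video_path.name, n_frames)
    cap.release()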

The results indicate that thermal infrared detection performs on par with the visible spectrum, with F1-scores of 0.7601 and 0.7849, respectively, highlighting the potential of infrared sensors under diverse atmospheric conditions. The acoustic classifier reaches an F1-score of 0.9323, demonstrating the efficacy of sound-based detection at closer ranges. The authors also examine detection-range limits using a modified Detect, Recognize and Identify (DRI) model, relating detector performance to sensor-to-target distance, a practical concern for deployment.
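
For reference, the F1-scores quoted above are the harmonic mean of precision and recall (the standard definition, not anything specific to this paper):

\[ F_1 = \frac{2\,\mathrm{precision}\cdot\mathrm{recall}}{\mathrm{precision}+\mathrm{recall}}, \qquad \mathrm{precision}=\frac{TP}{TP+FP}, \quad \mathrm{recall}=\frac{TP}{TP+FN} \]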

The implications of this research are multifaceted, ranging from enhanced security measures around sensitive venues to more accurate wildlife monitoring systems. The exploration of multi-sensor fusion represents a significant stride towards resilient autonomous surveillance systems, and the distance-based performance analysis invites further research into algorithms that dynamically adjust detection thresholds based on estimated range.

Future research could explore integrating LiDAR or radar to enrich the sensing modalities, further enhancing performance in complex environments. Improving the robustness of the machine learning models, for example by moving to YOLOv3 or by continuously updating models through transfer learning, could also extend operational effectiveness across varied drone configurations. The development and deployment of such systems will require continuous adaptation to emerging drone technologies, regulatory frameworks, and the environments in which they operate.

In summary, the authors have provided both a practical system and an invaluable dataset that significantly advance the ongoing efforts in autonomous drone detection, with scope for extending this research into broader fields requiring real-time object detection and classification.