
RADIATE: A Radar Dataset for Automotive Perception in Bad Weather (2010.09076v3)

Published 18 Oct 2020 in cs.CV and cs.RO

Abstract: Datasets for autonomous cars are essential for the development and benchmarking of perception systems. However, most existing datasets are captured with camera and LiDAR sensors in good weather conditions. In this paper, we present the RAdar Dataset In Adverse weaThEr (RADIATE), aiming to facilitate research on object detection, tracking and scene understanding using radar sensing for safe autonomous driving. RADIATE includes 3 hours of annotated radar images with more than 200K labelled road actors in total, on average about 4.6 instances per radar image. It covers 8 different categories of actors in a variety of weather conditions (e.g., sun, night, rain, fog and snow) and driving scenarios (e.g., parked, urban, motorway and suburban), representing different levels of challenge. To the best of our knowledge, this is the first public radar dataset which provides high-resolution radar images on public roads with a large amount of road actors labelled. The data collected in adverse weather, e.g., fog and snowfall, is unique. Some baseline results of radar based object detection and recognition are given to show that the use of radar data is promising for automotive applications in bad weather, where vision and LiDAR can fail. RADIATE also has stereo images, 32-channel LiDAR and GPS data, directed at other applications such as sensor fusion, localisation and mapping. The public dataset can be accessed at http://pro.hw.ac.uk/radiate/.

Authors (6)
  1. Marcel Sheeny (4 papers)
  2. Emanuele De Pellegrin (2 papers)
  3. Saptarshi Mukherjee (1 paper)
  4. Alireza Ahrabian (5 papers)
  5. Sen Wang (164 papers)
  6. Andrew Wallace (9 papers)
Citations (183)

Summary

Evaluating Autonomous Driving Perception with RADIATE: A Radar Dataset for Adverse Weather Conditions

The paper "RADIATE: A Radar Dataset for Automotive Perception in Bad Weather" introduces a radar dataset designed to advance perception for autonomous driving. It extends beyond conventional datasets by targeting adverse weather conditions such as rain, fog, and snowfall, leveraging radar to address the shortcomings of optical sensors like cameras and LiDAR under those conditions.

Dataset Composition and Novel Attributes

RADIATE features 3 hours of radar data complemented by stereo images, 32-channel LiDAR, and GPS information, with over 200,000 annotated objects across multiple driving scenarios. The annotated radar images cover 8 distinct actor categories, including cars, vans, buses, and pedestrians, under varied conditions such as sun, rain, fog, and snow. To the authors' knowledge, it is the first public high-resolution radar dataset with extensive object labels collected on public roads, including data captured in adverse weather.
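The headline figures above imply a rough scale for the dataset. A quick back-of-envelope check, using only the numbers stated in the abstract (the frame count and annotation rate are derived here, not reported in the paper):

```python
# Sanity-check the reported RADIATE statistics.
# All input figures come from the paper's abstract; the derived
# quantities are estimates, not values stated by the authors.

TOTAL_LABELLED_ACTORS = 200_000    # "more than 200K labelled road actors"
AVG_INSTANCES_PER_IMAGE = 4.6      # "about 4.6 instances per radar image"
HOURS_OF_RADAR = 3                 # "3 hours of annotated radar images"

# Implied number of annotated radar frames.
est_frames = TOTAL_LABELLED_ACTORS / AVG_INSTANCES_PER_IMAGE

# Implied effective annotation rate over the full 3 hours.
est_fps = est_frames / (HOURS_OF_RADAR * 3600)

print(f"~{est_frames:,.0f} annotated radar frames")
print(f"~{est_fps:.1f} annotated frames per second")
```

This suggests on the order of 43,000 annotated radar frames at roughly 4 frames per second, consistent with a mechanically scanning radar.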

Challenges and Opportunities in Adverse Weather Perception

Autonomous vehicles primarily depend on camera and LiDAR systems for environmental perception, yet these sensors exhibit diminished effectiveness in adverse weather due to phenomena such as attenuation and scattering. Radar, by contrast, operates at longer wavelengths that are largely unaffected by poor visibility. RADIATE demonstrates radar's applicability in detecting and tracking road actors under challenging conditions where optical systems can falter; in particular, the dataset highlights radar's potential in snow and fog, where optical sensing becomes unreliable.

Implementation and Evaluation of Radar-Based Detection

The authors apply radar-based object detection across diverse environmental settings, providing baseline results with deep learning models built on the Faster R-CNN architecture. Models trained on a combination of good- and adverse-weather data achieved promising performance, with Average Precision (AP) remaining consistent across conditions. Whereas optical sensors are sensitive to lighting and weather, radar-based detection remained reliable in urban, suburban, and motorway tests under snow, rain, and night scenarios.
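The Average Precision metric used to report these baselines can be computed from ranked detections and ground-truth boxes. The sketch below is a minimal, illustrative AP@0.5 implementation; the box format, IoU threshold, and toy matching scheme are common conventions assumed here, not details taken from the paper's evaluation code:

```python
def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def average_precision(preds, gts, iou_thr=0.5):
    """AP for one class: preds is a list of (score, box), gts a list of boxes.

    Detections are ranked by score; each matches at most one unused
    ground-truth box at IoU >= iou_thr. AP is the area under the
    precision-recall curve traced out by that ranking.
    """
    preds = sorted(preds, key=lambda p: -p[0])
    matched = set()
    ap, cum_tp, prev_recall = 0.0, 0, 0.0
    for rank, (score, box) in enumerate(preds, start=1):
        best, best_j = 0.0, -1
        for j, gt in enumerate(gts):
            if j not in matched and iou(box, gt) > best:
                best, best_j = iou(box, gt), j
        if best >= iou_thr:
            matched.add(best_j)
            cum_tp += 1
        recall = cum_tp / len(gts)
        precision = cum_tp / rank
        ap += precision * (recall - prev_recall)
        prev_recall = recall
    return ap

# Toy example: two ground-truth boxes, one hit and one far miss.
gts = [(0, 0, 10, 10), (20, 20, 30, 30)]
preds = [(0.9, (0, 0, 10, 10)), (0.8, (100, 100, 110, 110))]
print(average_precision(preds, gts))  # 0.5: one of two actors recovered
```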

Implications and Future Research Directions

RADIATE is poised to significantly impact research in autonomous driving by supplying a comprehensive dataset for evaluating how effectively radar-based perception systems handle adverse weather. Future work may explore the dataset's integration with machine learning methodologies tailored for radar imagery, advancing capabilities in end-to-end driving, SLAM, and sensor fusion. The dataset could stimulate significant advances in autonomous systems, expanding their operational viability across varied environments.

Further investigation could integrate motion dynamics and Doppler information for enriched tracking capabilities, enabling more precise scene understanding and actor behavior prediction. Improving radar sensors’ spatial resolution and further aligning sensor fusion algorithms can refine object detection accuracy, fostering improved synergistic interaction between radar and other sensor modalities.

In conclusion, the RADIATE dataset offers a substantial contribution to autonomous driving research, with radar enabling reliable perception even under adverse conditions that compromise traditional optical sensors. This addition to existing datasets provides a critical foundation for evolving vehicular autonomy and safety standards in diverse environmental landscapes.