
The Oxford Radar RobotCar Dataset: A Radar Extension to the Oxford RobotCar Dataset (1909.01300v3)

Published 3 Sep 2019 in cs.RO and eess.SP

Abstract: In this paper we present The Oxford Radar RobotCar Dataset, a new dataset for researching scene understanding using Millimetre-Wave FMCW scanning radar data. The target application is autonomous vehicles where this modality is robust to environmental conditions such as fog, rain, snow, or lens flare, which typically challenge other sensor modalities such as vision and LIDAR. The data were gathered in January 2019 over thirty-two traversals of a central Oxford route spanning a total of 280km of urban driving. It encompasses a variety of weather, traffic, and lighting conditions. This 4.7TB dataset consists of over 240,000 scans from a Navtech CTS350-X radar and 2.4 million scans from two Velodyne HDL-32E 3D LIDARs; along with six cameras, two 2D LIDARs, and a GPS/INS receiver. In addition we release ground truth optimised radar odometry to provide an additional impetus to research in this domain. The full dataset is available for download at: ori.ox.ac.uk/datasets/radar-robotcar-dataset

Authors (5)
  1. Dan Barnes (9 papers)
  2. Matthew Gadd (32 papers)
  3. Paul Murcutt (2 papers)
  4. Paul Newman (59 papers)
  5. Ingmar Posner (77 papers)
Citations (347)

Summary

  • The paper introduces a groundbreaking radar extension to the Oxford RobotCar Dataset with 240,000 radar scans across 280 km.
  • It details a comprehensive multi-sensor setup including FMCW radar, LIDARs, and cameras for enhanced perception in varied environments.
  • The dataset features ground truth optimized radar odometry, offering a robust platform for advancing autonomous localization and navigation research.

An Overview of the Oxford Radar RobotCar Dataset

The paper "The Oxford Radar RobotCar Dataset: A Radar Extension to the Oxford RobotCar Dataset" presents a new dataset for advancing research in autonomous vehicle perception with Millimetre-Wave Frequency-Modulated Continuous-Wave (FMCW) radar. It extends the original Oxford RobotCar Dataset and significantly supplements the data available for scene understanding and autonomous navigation by incorporating radar, a sensor modality that promises robustness in adverse environmental conditions where traditional sensors such as LIDAR and cameras face limitations.

Dataset Composition and Features

The Oxford Radar RobotCar Dataset comprises data gathered over 32 traversals of a central Oxford route in January 2019, covering a total of 280 kilometres. The dataset is notable for its size, 4.7 terabytes, and for its diversity of weather, lighting, and traffic conditions. It contains scans from a Navtech CTS350-X radar, two Velodyne HDL-32E 3D LIDARs, six cameras, two 2D LIDARs, and a GPS/INS receiver: over 240,000 radar scans and 2.4 million 3D LIDAR scans in total. An essential feature is the ground-truth optimised radar odometry released alongside the raw data, which supports research on mobile-robot and autonomous-vehicle localisation.
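A scanning FMCW radar of this kind returns a power-versus-range profile for each azimuth beam, so each scan arrives as a polar array (azimuths × range bins). Many downstream pipelines first resample this into a Cartesian image. The sketch below illustrates a minimal nearest-neighbour polar-to-Cartesian conversion; the function name, parameters, and the assumption of uniformly spaced beams starting at angle zero are illustrative, not taken from the dataset's release tools.

```python
import numpy as np

def polar_to_cartesian(polar_scan, azimuths, range_resolution,
                       cart_size=512, cart_res=0.5):
    """Resample a polar radar scan onto a Cartesian grid (nearest neighbour).

    polar_scan       : (num_azimuths, num_bins) array of power returns
    azimuths         : (num_azimuths,) beam angles in radians
                       (assumed uniformly spaced, starting at 0)
    range_resolution : metres per range bin
    cart_size        : output image is cart_size x cart_size pixels
    cart_res         : metres per output pixel
    """
    half = cart_size * cart_res / 2.0
    coords = np.linspace(-half, half, cart_size)
    yy, xx = np.meshgrid(coords, coords, indexing="ij")
    ranges = np.sqrt(xx ** 2 + yy ** 2)              # metres from sensor
    angles = np.mod(np.arctan2(yy, xx), 2 * np.pi)   # [0, 2*pi)

    # Nearest azimuth beam and nearest range bin for every output pixel.
    az_step = 2 * np.pi / len(azimuths)
    az_idx = np.round(angles / az_step).astype(int) % len(azimuths)
    rng_idx = np.round(ranges / range_resolution).astype(int)

    cart = np.zeros((cart_size, cart_size), dtype=polar_scan.dtype)
    valid = rng_idx < polar_scan.shape[1]            # inside sensed range
    cart[valid] = polar_scan[az_idx[valid], rng_idx[valid]]
    return cart
```

Bilinear interpolation across neighbouring beams and bins gives smoother images, but nearest-neighbour lookup keeps the geometry of the resampling easy to follow.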

Advantages and Applications of Radar

FMCW radar offers significant advantages in autonomous vehicle applications because it is robust to environmental obscurants such as fog, rain, and snow. Unlike optical sensors, radar can consistently capture the surrounding environment under these challenging conditions. Its 360-degree field of view and long detection range are critical for safe navigation at higher speeds and in unstructured environments where traditional sensors might struggle.

Comparative Analysis with Existing Datasets

The dataset is positioned as a complement to existing vision-based and LIDAR-based autonomous driving datasets. While LIDAR and vision modalities have contributed substantially to urban autonomy, the authors advocate for wider adoption of radar, emphasising its complementary strengths. Prior to this release, few datasets had provided FMCW radar for mobile robotics, making this dataset uniquely valuable for exploring research avenues such as radar-based mapping, navigation, and perception.

Implications for Future Research

The authors anticipate this dataset will catalyze new research areas that leverage radar's distinct competencies. Notably, the dataset provides a platform for investigating state estimation and odometry using radar data—an area that has not been extensively explored. Furthermore, the robustness of radar in varied environmental conditions positions it as a potential alternative or complement to existing sensing modalities in autonomous vehicles.
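One common starting point for radar-based motion estimation is correlative scan matching: two consecutive scans, rendered as Cartesian images, are aligned by finding the shift that maximises their cross-correlation. The sketch below uses FFT-based phase correlation to recover a pixel translation; it is a generic illustration of the idea, not the authors' optimisation-based odometry pipeline, and the function name is hypothetical.

```python
import numpy as np

def estimate_translation(img_a, img_b):
    """Estimate the integer pixel shift between two images by phase
    correlation. Returns (dy, dx) such that np.roll(img_b, (dy, dx),
    axis=(0, 1)) best aligns img_b with img_a."""
    fa = np.fft.fft2(img_a)
    fb = np.fft.fft2(img_b)
    cross = fa * np.conj(fb)
    cross /= np.abs(cross) + 1e-12        # keep only the phase
    corr = np.fft.ifft2(cross).real       # sharp peak at the shift
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap indices past the midpoint around to negative shifts.
    shifts = [int(p) if p <= s // 2 else int(p) - s
              for p, s in zip(peak, corr.shape)]
    return tuple(shifts)
```

In practice radar odometry must also estimate rotation and cope with speckle, ghost returns, and scale, which is why the ground-truth odometry shipped with the dataset is valuable as a benchmark for such methods.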

The introduction of this dataset has broad implications for both theoretical advancements and practical applications in autonomous vehicles and mobile robotics. On the theoretical side, it invites further exploration into the integration of radar data for complex scene understanding. Practically, the dataset supports the development of reliable autonomous systems capable of robust operation in all weather conditions.

Conclusion

The release of the Oxford Radar RobotCar Dataset marks an important step in extending autonomous driving research into the domain of radar sensing. By providing large-scale, diverse radar data together with ground-truth odometry, the dataset empowers researchers to develop and validate novel algorithms for autonomous driving and mobile robotics. Future work could extend the dataset's scope with further data collection and explore cross-modal sensor fusion, ultimately advancing the safety and reliability of autonomous navigation technologies.
