
eTraM: Event-based Traffic Monitoring Dataset (2403.19976v2)

Published 29 Mar 2024 in cs.CV

Abstract: Event cameras, with their high temporal and dynamic range and minimal memory usage, have found applications in various fields. However, their potential in static traffic monitoring remains largely unexplored. To facilitate this exploration, we present eTraM - a first-of-its-kind, fully event-based traffic monitoring dataset. eTraM offers 10 hr of data from different traffic scenarios in various lighting and weather conditions, providing a comprehensive overview of real-world situations. Providing 2M bounding box annotations, it covers eight distinct classes of traffic participants, ranging from vehicles to pedestrians and micro-mobility. eTraM's utility has been assessed using state-of-the-art methods for traffic participant detection, including RVT, RED, and YOLOv8. We quantitatively evaluate the ability of event-based models to generalize on nighttime and unseen scenes. Our findings substantiate the compelling potential of leveraging event cameras for traffic monitoring, opening new avenues for research and application. eTraM is available at https://eventbasedvision.github.io/eTraM

References (9)
  1. Event Camera Evaluation Kit 4 HD IMX636 Prophesee-Sony.
  2. High-speed tracking-by-detection without using image information. In International Workshop on Traffic and Street Surveillance for Safety and Security at IEEE AVSS 2017, Lecce, Italy, Aug. 2017.
  3. On event-based optical flow detection. Frontiers in neuroscience, 9:137, 2015.
  4. Recurrent vision transformers for object detection with event cameras, 2023.
  5. YOLO by Ultralytics, Jan. 2023.
  6. Hots: A hierarchy of event-based time-surfaces for pattern recognition. IEEE Transactions on Pattern Analysis and Machine Intelligence, 39(7):1346–1359, 2017.
  7. Event-based vision meets deep learning on steering prediction for self-driving cars. In 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition. IEEE, June 2018.
  8. Steering a predator robot using a mixed frame/event-driven convolutional neural network, 2016.
  9. Learning to detect objects with a 1 megapixel event camera. In H. Larochelle, M. Ranzato, R. Hadsell, M.F. Balcan, and H. Lin, editors, Advances in Neural Information Processing Systems, volume 33, pages 16639–16652. Curran Associates, Inc., 2020.

Summary

  • The paper presents a novel event-camera dataset with over 2M annotations capturing diverse traffic scenarios.
  • The paper details a static high-resolution data acquisition method using the Prophesee EVK4 HD for precise motion tracking.
  • The paper demonstrates strong performance in detecting and tracking traffic participants under challenging lighting conditions.

Event-based Traffic Monitoring: Introducing the eTraM Dataset

Introduction to eTraM

The eTraM dataset represents a pioneering advancement in traffic monitoring through the lens of event cameras. It comprises more than 10 hours of data from diverse traffic scenarios under a range of lighting and weather conditions, giving a comprehensive view of the dynamic interactions among different traffic participants, including pedestrians, vehicles of varying sizes, and micro-mobility elements such as bicycles, e-scooters, and wheelchairs. With over 2 million bounding box annotations spanning eight distinct classes of traffic participants, eTraM sets a new benchmark for event-based traffic monitoring research.

The dataset was compiled using the high-resolution Prophesee EVK4 HD event camera, capturing the fine-grained motion detail needed to understand complex traffic dynamics. Its static, roadside perspective makes it well suited to comprehensive traffic monitoring, recording traffic movement and interactions across different times of day.

Dataset Acquisition and Characteristics

Event cameras, with their high temporal resolution and dynamic range, offer a novel perspective on visual sensing, providing asynchronous streams of brightness changes instead of conventional frames. This allows for capturing detailed motion information, especially useful in dynamic and fast-paced scenarios like traffic monitoring. The eTraM dataset harnesses these strengths of event cameras, collecting data across various urban settings around Arizona State University, ensuring diversity in traffic dynamics captured.
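The asynchronous stream described above is a sequence of per-pixel events rather than frames. A common way to feed such data to frame-based detectors is to accumulate events into a per-polarity count image over a time window. The sketch below illustrates this representation with a tiny hand-made event array; the field names and layout are assumptions for illustration, not eTraM's or Prophesee's actual data format.

```python
import numpy as np

# Hypothetical event stream: each event is (x, y, timestamp_us, polarity).
# Real event-camera SDKs expose similar fields, but this exact layout is
# an assumption for illustration only.
events = np.array(
    [(2, 3, 100, 1), (2, 3, 150, 0), (5, 1, 200, 1)],
    dtype=[("x", "u2"), ("y", "u2"), ("t", "u8"), ("p", "u1")],
)

def events_to_histogram(events, width, height):
    """Accumulate events into a 2-channel (per-polarity) count image."""
    hist = np.zeros((2, height, width), dtype=np.int32)
    # np.add.at performs unbuffered accumulation, so repeated (p, y, x)
    # indices are counted correctly.
    np.add.at(hist, (events["p"], events["y"], events["x"]), 1)
    return hist

hist = events_to_histogram(events, width=8, height=8)
print(hist[1, 3, 2])  # positive-polarity count at (x=2, y=3) -> 1
```

Representations like this (event histograms, time surfaces, voxel grids) are the usual bridge between raw event streams and detectors such as RVT or YOLOv8.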

The dataset stands out not just for its extensive coverage of different traffic participants but also for its meticulous annotation process. Over 2 million bounding box annotations provide detailed information on a wide array of traffic elements, making it a highly valuable resource for the development and evaluation of traffic monitoring algorithms.
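To make the annotation discussion concrete, the sketch below parses a toy per-box annotation table into (timestamp, class, corner-format box) records. The CSV layout, column names, and class labels here are hypothetical; eTraM's actual annotation format is not specified in this summary.

```python
import csv
import io

# Hypothetical annotation layout: one row per bounding box, with a
# timestamp, class label, and top-left (x, y) plus width/height.
sample = """t_us,cls,x,y,w,h
1000,pedestrian,120,40,18,42
1000,car,300,60,90,55
"""

def load_boxes(text):
    """Parse annotation rows into dicts with (x1, y1, x2, y2) corner boxes."""
    return [
        {
            "t": int(r["t_us"]),
            "cls": r["cls"],
            "box": (int(r["x"]), int(r["y"]),
                    int(r["x"]) + int(r["w"]), int(r["y"]) + int(r["h"])),
        }
        for r in csv.DictReader(io.StringIO(text))
    ]

boxes = load_boxes(sample)
print(boxes[0]["cls"], boxes[0]["box"])  # pedestrian (120, 40, 138, 82)
```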

Distinguishing Features

One of the noteworthy features of eTraM is its integration of event-based data from a static perspective, a rarity in current datasets primarily focused on ego-motion perspectives. This static viewpoint brings a new dimension to traffic monitoring by capturing traffic dynamics without the additional motion noise from the sensor's movement. Furthermore, the dataset's inclusion of nighttime data significantly enhances its versatility, offering researchers the opportunity to explore the performance of event-based models under low-light conditions.

Evaluation and Findings

Preliminary evaluations using state-of-the-art methods like RVT, RED, and YOLOv8 highlight eTraM's potential to improve traffic participant detection and tracking. These evaluations underscore the dataset's significance in advancing the capabilities of event-based models, particularly under challenging conditions such as nighttime and unseen scenes. The results indicate notable performance in vehicle and pedestrian detection across various scenarios, emphasizing the dataset's practical relevance to real-world applications in intelligent transportation systems.
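Detection benchmarks like these are typically scored with mean average precision, which matches predicted and ground-truth boxes by intersection-over-union (IoU). As a minimal sketch of that core metric (not the paper's evaluation code), IoU for axis-aligned boxes can be computed as:

```python
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix1 = max(box_a[0], box_b[0])
    iy1 = max(box_a[1], box_b[1])
    ix2 = min(box_a[2], box_b[2])
    iy2 = min(box_a[3], box_b[3])
    # Clamp to zero when the boxes do not overlap.
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

print(iou((0, 0, 2, 2), (1, 1, 3, 3)))  # 1 / 7, about 0.143
```

A prediction typically counts as a true positive when its IoU with a same-class ground-truth box exceeds a threshold such as 0.5.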

Future Implications

The eTraM dataset opens up new avenues for research in event-based traffic monitoring. Its unique composition and the wealth of annotations present opportunities for developing advanced traffic monitoring algorithms that can operate efficiently across different lighting and weather conditions. Additionally, the dataset's emphasis on static event camera data calls for novel algorithmic approaches that leverage the temporal granularity and high dynamic range of event cameras.

This dataset not only holds promise for enhancing traffic safety and management through improved monitoring but also for fostering innovation in leveraging event cameras for broader applications within intelligent transportation systems.

Final Thoughts

In conclusion, eTraM stands as a landmark contribution to event-based traffic monitoring. Its comprehensive coverage, detailed annotations, and unique static perspective make it a highly valuable resource for researchers and practitioners alike. As the community continues to explore the full potential of event cameras, eTraM will undoubtedly play a pivotal role in shaping future directions in traffic monitoring and beyond.