Multimodal-Wireless: A Large-Scale Dataset for Sensing and Communication (2511.03220v1)
Abstract: This paper presents Multimodal-Wireless, an open-source multimodal sensing dataset designed for wireless communication research. The dataset is generated through an integrated, customizable data pipeline built on the CARLA simulator and the Sionna framework. It contains approximately 160,000 frames collected across four virtual towns, sixteen communication scenarios, and three weather conditions, encompassing multiple sensing modalities: communication channel, light detection and ranging (LiDAR), RGB and depth cameras, inertial measurement unit (IMU), and radar. This paper provides a comprehensive overview of the dataset, outlining its key features, overall framework, and technical implementation details. In addition, it explores potential research applications in communication and collaborative perception, exemplified by beam prediction using a multimodal LLM. The dataset is openly available at https://le-liang.github.io/mmw/.
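The abstract does not specify how the dataset is organized on disk, so the sketch below is a rough illustration only: it models how frames spanning the listed towns, scenarios, weather conditions, and modalities might be indexed and filtered. All names here (`Frame`, `filter_frames`, and the field names) are hypothetical and are not taken from the released dataset.

```python
from dataclasses import dataclass

# The six modalities named in the abstract.
MODALITIES = frozenset({"channel", "lidar", "rgb", "depth", "imu", "radar"})

@dataclass(frozen=True)
class Frame:
    """Hypothetical per-frame index entry; field names are illustrative only."""
    town: str                        # e.g. "Town01" (a CARLA virtual town)
    scenario: int                    # one of the sixteen communication scenarios
    weather: str                     # e.g. "clear", "rain", "fog"
    modalities: frozenset = MODALITIES

def filter_frames(frames, town=None, weather=None):
    """Select frames matching the given town and/or weather condition."""
    return [
        f for f in frames
        if (town is None or f.town == town)
        and (weather is None or f.weather == weather)
    ]

# Tiny synthetic index standing in for the ~160,000 real frames.
index = [
    Frame("Town01", 1, "clear"),
    Frame("Town01", 2, "rain"),
    Frame("Town02", 1, "clear"),
]
print(len(filter_frames(index, town="Town01")))    # 2
print(len(filter_frames(index, weather="clear")))  # 2
```

A real loader would additionally map each entry to the on-disk sensor files; this stub only shows the indexing pattern.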