SCaRL - A Synthetic Multi-Modal Dataset for Autonomous Driving

Conference: ICMIM 2024 - 7th IEEE MTT Conference
April 16-17, 2024, in Boppard

Proceedings: ITG-Fb. 315: ICMIM 2024

Pages: 4
Language: English
Type: PDF

Authors:
Ramesh, Avinash Nittur; Correas-Serrano, Aitor; Gonzalez-Huici, Maria

Abstract:
We present SCaRL, a novel synthetically generated multi-modal dataset that enables the training and validation of autonomous driving solutions. Multi-modal datasets are essential to attain the robustness and high accuracy required by autonomous systems in applications such as autonomous driving. As deep learning-based solutions become more prevalent for object detection, classification, and tracking tasks, there is great demand for datasets that combine camera, lidar, and radar sensors. Existing real and synthetic datasets for autonomous driving lack synchronized data collection from a complete sensor suite. SCaRL provides synchronized synthetic data from RGB, semantic/instance-segmentation, and depth cameras; range-Doppler-azimuth/elevation maps and raw data from radar; and 3D point clouds and 2D maps of semantic, depth, and Doppler data from coherent lidar. SCaRL is a large dataset based on the CARLA Simulator, which provides data for diverse, dynamic scenarios and traffic conditions. SCaRL is the first dataset to include synthetic synchronized data from coherent lidar and MIMO radar sensors.
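To make the modality list above concrete, the sketch below shows what one synchronized multi-modal frame could look like as a data record. This is a hypothetical layout for illustration only: the field names, array shapes, and the `ScarlSample`/`is_synchronized` helpers are assumptions, not SCaRL's actual schema or API.

```python
from dataclasses import dataclass
import numpy as np


@dataclass
class ScarlSample:
    """One synchronized multi-modal frame (hypothetical layout; field
    names and shapes are illustrative, not the dataset's real schema)."""
    timestamp: float          # shared capture time for all sensors (seconds)
    rgb: np.ndarray           # H x W x 3 camera image
    semantic: np.ndarray      # H x W semantic-segmentation labels
    instance: np.ndarray      # H x W instance ids
    depth: np.ndarray         # H x W depth map (meters)
    radar_rda: np.ndarray     # range x Doppler x azimuth map
    radar_raw: np.ndarray     # raw radar samples (chirps x channels x samples)
    lidar_points: np.ndarray  # N x 6: x, y, z, semantic label, depth, Doppler


def is_synchronized(samples, tol=1e-3):
    """Check that the per-sensor capture times agree within tol seconds."""
    ts = [s.timestamp for s in samples]
    return max(ts) - min(ts) <= tol
```

A loader built around such a record would let a fusion model consume all modalities of a scene at a common timestamp, which is the synchronization property the dataset emphasizes.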