SCaRL - A Synthetic Multi-Modal Dataset for Autonomous Driving

Conference: ICMIM 2024 - 7th IEEE MTT Conference
04/16/2024 - 04/17/2024 in Boppard, Germany

Proceedings: ITG-Fb. 315: ICMIM 2024

Pages: 4
Language: English
Type: PDF

Authors:
Ramesh, Avinash Nittur; Correas-Serrano, Aitor; Gonzalez-Huici, Maria

Abstract:
We present SCaRL, a novel synthetically generated multi-modal dataset that enables the training and validation of autonomous driving solutions. Multi-modal datasets are essential for attaining the robustness and high accuracy required by autonomous systems in applications such as autonomous driving. As deep learning-based solutions become more prevalent for object detection, classification, and tracking tasks, there is great demand for datasets that combine camera, lidar, and radar sensors. Existing real and synthetic datasets for autonomous driving lack synchronized data collection from a complete sensor suite. SCaRL provides synchronized Synthetic data from RGB, semantic/instance-segmentation, and depth Cameras; Range-Doppler-Azimuth/Elevation maps and raw data from Radar; and 3D point clouds and 2D maps of semantic, depth, and Doppler data from coherent Lidar. Built on the CARLA Simulator, SCaRL is a large dataset covering diverse, dynamic scenarios and traffic conditions. SCaRL is the first dataset to include synchronized synthetic data from coherent lidar and MIMO radar sensors.