GTASynth: 3D synthetic data of outdoor non-urban environments

Fontana, Simone (co-first author); Sorrenti, Domenico G. (second author)
2022

Abstract

Developing point cloud registration, SLAM, or place recognition algorithms requires data with a high-quality ground truth (usually composed of a position and an orientation). Moreover, many machine learning algorithms require large amounts of data for training. However, acquiring this kind of data in non-urban outdoor environments poses several challenges. First of all, off-road robots are usually very expensive. Above all, producing an accurate ground truth is problematic: even the best sensor available, i.e., RTK GPS, cannot guarantee the required accuracy in every condition. Hence the scarcity of datasets for point cloud registration or SLAM in off-road conditions. For these reasons, we propose a synthetic dataset generated using Grand Theft Auto V (GTAV), a video game that accurately simulates sensing in outdoor environments. The data production technique is based on DeepGTAV-PreSIL [1]: a simulated LiDAR and a camera are installed on a vehicle that is driven through the GTAV map. Since one of the goals of our work is to produce a large amount of data to train neural networks that will later be used with real data, we chose the characteristics of the simulated sensors to accurately match real ones. The proposed dataset is composed of 16,207 point clouds and images, divided into five sequences representing different environments, such as fields, woods, and mountains. For each pair of point cloud and image we also provide the ground-truth pose of the vehicle at acquisition time.
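Since the data production technique is based on DeepGTAV-PreSIL, a minimal loader sketch might look like the following. Note that the exact on-disk layout is an assumption, not confirmed by this record: the sketch assumes KITTI-style point clouds (flat `float32` binaries of x, y, z, intensity) and a KITTI-odometry-style pose file (one row-major 3x4 transformation matrix per line); the function names are hypothetical.

```python
# Hypothetical loader sketch for a GTASynth-style sequence.
# ASSUMPTIONS (not stated in the record): KITTI-style .bin scans and
# KITTI-odometry-style pose files, as used by the DeepGTAV-PreSIL pipeline.
import numpy as np

def load_point_cloud(path):
    """Read an N x 4 float32 binary scan and return the N x 3 XYZ points."""
    scan = np.fromfile(path, dtype=np.float32).reshape(-1, 4)
    return scan[:, :3]

def load_poses(path):
    """Parse a pose file with one row-major 3x4 matrix per line,
    returning full 4x4 homogeneous transforms."""
    rows = np.loadtxt(path).reshape(-1, 3, 4)
    # Append the constant [0, 0, 0, 1] bottom row to each matrix.
    bottom = np.broadcast_to([0.0, 0.0, 0.0, 1.0], (rows.shape[0], 1, 4))
    return np.concatenate([rows, bottom], axis=1)

def transform_to_world(points, pose):
    """Apply a 4x4 ground-truth pose to N x 3 sensor-frame points."""
    homog = np.hstack([points, np.ones((points.shape[0], 1))])
    return (pose @ homog.T).T[:, :3]
```

With such ground-truth poses, consecutive scans can be mapped into a common world frame, which is the typical way registration results are evaluated against the ground truth.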
Journal article - Scientific article
3D Data; Grand Theft Auto 5; Mapping; Point Cloud; Point Cloud Registration; Robotics; SLAM; Synthetic Dataset
English
Curnis, G., Fontana, S., Sorrenti, D. (2022). GTASynth: 3D synthetic data of outdoor non-urban environments. DATA IN BRIEF, 43(August 2022) [10.1016/j.dib.2022.108412].
Curnis, G; Fontana, S; Sorrenti, D
Files in this record:

File: 1-s2.0-S2352340922006096-main.pdf
Open access
Description: Data Article
Attachment type: Publisher's Version (Version of Record, VoR)
Size: 1.6 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/10281/388640
Citations
  • Scopus 0
  • Web of Science 0