
Latest news:

<!-- - **14/03/2022:** **RADIal full resolution is available now! Check the download section.** We moved the dataset to a new location to guarantee enough download bandwidth. -->

RADIal dataset

RADIal stands for “Radar, Lidar et al.” It is a collection of 2 hours of raw data from synchronized automotive-grade sensors (camera, laser, High Definition radar) in various environments (city street, highway, countryside road), and it comes with GPS and the vehicle’s CAN traces.

RADIal contains 91 sequences of 1 to 4 minutes each, for a total of 2 hours. These sequences are categorized as highway, countryside and city driving. The distribution of the sequences is shown in the figure below. Each sequence contains the raw sensor signals recorded at their native frame rates. There are approximately 25,000 frames with the three sensors synchronized, out of which 8,252 are labelled with a total of 9,550 vehicles.<br/>

<p align="center"> <img src="images/RADIal_statistics.png" width="360" height="320"> </p>

If you find this code useful for your research, please cite our paper:

@InProceedings{Rebut_2022_CVPR,
               author = {Rebut, Julien and Ouaknine, Arthur and Malik, Waqas and P\'erez, Patrick},
               title = {Raw High-Definition Radar for Multi-Task Learning},
               booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
               month = {June},
               year = {2022},
               pages = {17021-17030}
               }

Sensor specifications

<p align="center"> <img src="images/vehicle.png"> </p>

Central to the RADIal dataset, our high-definition radar is composed of NRx = 16 receiving antennas and NTx = 12 transmitting antennas, leading to NRx·NTx = 192 virtual antennas. This virtual-antenna array enables a high azimuth angular resolution while also estimating objects’ elevation angles. As the radar signal is difficult for annotators and practitioners alike to interpret, a 16-layer automotive-grade laser scanner (LiDAR) and a 5 Mpix RGB camera are also provided. The camera is placed below the interior mirror behind the windshield, while the radar and the LiDAR are installed in the middle of the front ventilation grid, one above the other. The three sensors have parallel horizontal lines of sight, pointing in the driving direction. Their extrinsic parameters are provided together with the dataset. RADIal also offers synchronized GPS and CAN traces, which give access to the geo-referenced position of the vehicle as well as its driving information such as speed, steering-wheel angle and yaw rate. The sensors’ specifications are detailed in the table below.<br/>

<p align="center"> <img src="images/Sensor_Specs.png" width="460" height="260" > </p>
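For readers unfamiliar with MIMO radar, the short sketch below illustrates how the 16 receiving and 12 transmitting antennas combine into 192 virtual antennas: each (Tx, Rx) pair behaves like one virtual element located at the sum of the two physical positions. The antenna spacings used here are made-up placeholders, not the actual RADIal antenna layout; only the antenna counts come from the dataset description.

```python
import numpy as np

# Hypothetical uniform spacings (in units of half a wavelength); the real
# RADIal antenna positions are not reproduced here.
n_rx, n_tx = 16, 12
rx_positions = np.arange(n_rx) * 0.5          # assumed Rx spacing
tx_positions = np.arange(n_tx) * n_rx * 0.5   # assumed Tx spacing

# All pairwise sums of Tx and Rx positions form the virtual array.
virtual_positions = (tx_positions[:, None] + rx_positions[None, :]).ravel()
print(virtual_positions.size)  # 192 virtual antennas
```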

Dataset structure

RADIal is a single folder containing all the recorded sequences. Each sequence is a folder containing:

We provide a Python library, DBReader, to read the data. Because all the radar data are recorded in a raw format, that is to say the signal after the analog-to-digital conversion (ADC), we also provide an optimized Python library, SignalProcessing, to process the radar signal and generate either the power spectrums, the point cloud or the range-azimuth map.
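To give an idea of what "processing the raw ADC signal into a power spectrum" involves, here is a minimal, illustrative FMCW processing sketch. It is not the SignalProcessing library's implementation; the function name, array shapes and windowing choices are assumptions made for the example.

```python
import numpy as np

def adc_to_power_spectrum(adc):
    """Toy range-Doppler power spectrum from raw ADC samples.

    `adc` is assumed to have shape (n_chirps, n_samples) for a single
    receiving antenna; this is a generic FMCW processing chain, not the
    RADIal SignalProcessing implementation.
    """
    n_chirps, n_samples = adc.shape
    adc = adc * np.hanning(n_samples)[None, :]   # range (fast-time) window
    adc = adc * np.hanning(n_chirps)[:, None]    # Doppler (slow-time) window
    range_fft = np.fft.fft(adc, axis=1)          # fast-time FFT -> range bins
    doppler_fft = np.fft.fftshift(np.fft.fft(range_fft, axis=0), axes=0)  # slow-time FFT -> Doppler bins
    return 20 * np.log10(np.abs(doppler_fft) + 1e-12)  # power in dB

# Random data standing in for one antenna's raw recording.
rd_map = adc_to_power_spectrum(np.random.randn(256, 512))
print(rd_map.shape)  # (256, 512): Doppler bins x range bins
```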

Labels

Out of the 25,000 synchronized frames, 8,252 are labelled. Labels for vehicles are stored in a separate CSV file. Each label contains the following information:

Note that a value of -1 in all fields means a frame without any label.

Labels for the free driving space are provided as a segmentation mask saved in a PNG file.
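As an illustration, the snippet below loads the vehicle labels from a CSV file, separates unlabelled frames (rows whose fields are all -1) and reads a free-space segmentation mask. The file names are placeholders, not the actual names used in the dataset.

```python
import numpy as np
import pandas as pd
from PIL import Image

# Hypothetical file names; adapt them to the files shipped with the dataset.
labels = pd.read_csv("labels.csv")

# A row whose fields are all -1 corresponds to a frame without any vehicle
# label (depending on the file layout, a frame-index column may need to be
# excluded from this check first).
is_empty = (labels == -1).all(axis=1)
vehicle_labels = labels[~is_empty]
print(f"{len(vehicle_labels)} vehicle annotations, {int(is_empty.sum())} empty frames")

# The free-driving-space label is a segmentation mask stored as a PNG image.
free_space_mask = np.array(Image.open("freespace_mask.png"))
print(free_space_mask.shape)
```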

Download instructions

To download the raw dataset, please visit the following Google Drive

You will then have to use the SignalProcessing library to generate the data for each modality according to your needs.

We also provide a "ready-to-use" dataset that can be loaded with the PyTorch data loader example provided in the Loader folder.
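The complete loader lives in the Loader folder of the repository; the skeleton below only sketches the general PyTorch Dataset/DataLoader pattern it relies on. The class name, sample layout and tensor shapes are assumptions made for the example.

```python
import torch
from torch.utils.data import Dataset, DataLoader

class RADIalReadyToUse(Dataset):
    """Minimal Dataset skeleton; see the Loader folder for the actual loader.

    `samples` is assumed to be a list of pre-generated items, e.g. dicts
    holding a radar power spectrum and the associated label tensor."""

    def __init__(self, samples):
        self.samples = samples

    def __len__(self):
        return len(self.samples)

    def __getitem__(self, index):
        item = self.samples[index]
        return item["radar"], item["labels"]

# Usage: wrap the dataset in a standard DataLoader.
dummy = [{"radar": torch.zeros(256, 512), "labels": torch.zeros(5)} for _ in range(8)]
loader = DataLoader(RADIalReadyToUse(dummy), batch_size=4, shuffle=True)
for radar, labels in loader:
    print(radar.shape, labels.shape)  # torch.Size([4, 256, 512]) torch.Size([4, 5])
```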