argoverse_cbgs_kf_tracker

Precomputed 3D Detections

The precomputed 3D detections were generated on the Argoverse dataset using the method described in Class-balanced Grouping and Sampling for Point Cloud 3D Object Detection (CBGS), with the detection range increased to 100 meters in each direction, and detections pruned to the region of interest (ROI) to match the Argoverse annotation policy.

The detections can be freely downloaded from our 3D tracking competition page [.zip].

Kalman Filter Tracking

This code extends AB3DMOT, subject to its license. However, instead of tracking in the camera coordinate frame (as AB3DMOT does), we perform tracking in the Argoverse city coordinate frame (see Argoverse paper and appendix).
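Converting a detection from the egovehicle frame into the city frame amounts to applying the log's city-to-egovehicle SE(3) pose. The sketch below is a hypothetical, simplified illustration; the function name `egovehicle_to_city` and the example pose are ours for illustration, not the argoverse-api interface:

```python
import numpy as np

def egovehicle_to_city(points_ego: np.ndarray, R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Apply an SE(3) city-from-egovehicle pose to an (N, 3) array of points.

    R: 3x3 rotation matrix, t: 3-vector translation (as read from a log's
    per-timestamp pose). Illustrative sketch, not the argoverse-api call.
    """
    return points_ego @ R.T + t

# Example: a detection center 10 m ahead of the egovehicle, with the
# egovehicle at city position (500, 200, 0) and heading +90 degrees (yaw).
yaw = np.pi / 2
R = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],
              [np.sin(yaw),  np.cos(yaw), 0.0],
              [0.0,          0.0,         1.0]])
t = np.array([500.0, 200.0, 0.0])

center_ego = np.array([[10.0, 0.0, 0.0]])
center_city = egovehicle_to_city(center_ego, R, t)
# The point lands at approximately (500, 210, 0) in the city frame.
```

Because all tracks live in this fixed city frame, egovehicle motion never masquerades as object motion.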

Instead of greedily matching sporadic detections, we solve a number of independent estimation problems (filtering) in a factor graph. Specifically, we use the IoU metric to perform data association (decoupling the estimation problems), and then consider each 3D detection as a measurement of an unknown state for a particular vehicle.
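The data-association step can be sketched as follows. This is a hypothetical simplification: it uses axis-aligned bird's-eye-view boxes rather than the oriented 3D boxes the tracker actually compares, and the names `iou_bev` and `greedy_associate` are illustrative:

```python
def iou_bev(box_a, box_b):
    """Axis-aligned bird's-eye-view IoU between two boxes given as
    (cx, cy, w, l) tuples. Simplified sketch: the real tracker scores
    oriented 3D boxes."""
    ax1, ay1 = box_a[0] - box_a[2] / 2, box_a[1] - box_a[3] / 2
    ax2, ay2 = box_a[0] + box_a[2] / 2, box_a[1] + box_a[3] / 2
    bx1, by1 = box_b[0] - box_b[2] / 2, box_b[1] - box_b[3] / 2
    bx2, by2 = box_b[0] + box_b[2] / 2, box_b[1] + box_b[3] / 2
    iw = max(0.0, min(ax2, bx2) - max(ax1, bx1))
    ih = max(0.0, min(ay2, by2) - max(ay1, by1))
    inter = iw * ih
    union = box_a[2] * box_a[3] + box_b[2] * box_b[3] - inter
    return inter / union if union > 0 else 0.0

def greedy_associate(tracks, detections, iou_thresh=0.1):
    """Match each track to its highest-IoU unclaimed detection.

    Returns (matches, unmatched_detection_indices); unmatched detections
    would seed new tracks.
    """
    matches, unmatched_dets = [], set(range(len(detections)))
    for ti, trk in enumerate(tracks):
        best_di, best_iou = None, iou_thresh
        for di in unmatched_dets:
            iou = iou_bev(trk, detections[di])
            if iou > best_iou:
                best_di, best_iou = di, iou
        if best_di is not None:
            matches.append((ti, best_di))
            unmatched_dets.discard(best_di)
    return matches, sorted(unmatched_dets)
```

Once detections are assigned to tracks, each track's filter can be updated independently with its matched detection as the measurement.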

Results on Argoverse Leaderboard

As of Wednesday, April 15, 2020, this implementation took 1st place on the Argoverse 3D tracking test set (leaderboard). Several per-metric results are shown below:

| Car <br> MOTA | Pedestrian <br> MOTA | Car <br> MOTPD | Pedestrian <br> MOTPD | Car MT <br> (Mostly Tracked) | Pedestrian MT <br> (Mostly Tracked) | Car <br> FN | Ped. <br> FN |
| --- | --- | --- | --- | --- | --- | --- | --- |
| 65.90 | 48.31 | 0.34 | 0.37 | 0.51 | 0.28 | 23,594 | 25,780 |

Choice of Coordinate Frame

Tracking in the "city frame" is advantageous over tracking in the egovehicle frame or camera coordinate frame, since parked cars remain stationary in the city frame. You can find our technical report here (runner-up at the NeurIPS '19 Argoverse 3D Tracking Competition, which used lower-quality detections from PointPillars, achieving 48.33 Car MOTA).
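The parked-car point can be made concrete with the filter's motion model: under a constant-velocity model, a parked car produces identical position measurements in the city frame, so the velocity estimate never moves off zero. Below is a hypothetical, heavily simplified per-axis constant-velocity Kalman filter; the real tracker's state covers the full 3D box, and the class and parameter names here are illustrative:

```python
import numpy as np

class ConstantVelocityKF:
    """Illustrative sketch: constant-velocity Kalman filter over state
    [x, y, z, vx, vy, vz], measuring position [x, y, z] only."""

    def __init__(self, z0: np.ndarray, dt: float = 0.1):
        self.x = np.hstack([z0, np.zeros(3)])   # initialize at first detection
        self.P = np.eye(6) * 10.0               # large initial uncertainty
        self.F = np.eye(6)
        self.F[:3, 3:] = np.eye(3) * dt         # position += velocity * dt
        self.H = np.hstack([np.eye(3), np.zeros((3, 3))])
        self.Q = np.eye(6) * 0.01               # process noise (assumed values)
        self.R = np.eye(3) * 0.1                # measurement noise (assumed values)

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update(self, z: np.ndarray):
        y = z - self.H @ self.x                   # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)  # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(6) - K @ self.H) @ self.P

# A parked car in the city frame: identical measurements each frame,
# so the velocity estimate stays at zero.
parked = ConstantVelocityKF(np.array([5.0, 5.0, 0.0]))
for _ in range(5):
    parked.predict()
    parked.update(np.array([5.0, 5.0, 0.0]))
```

In the egovehicle frame, by contrast, the same parked car would appear to move whenever the egovehicle moves, forcing the filter to chase apparent motion that the city frame removes for free.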

Running the Code

First, install the argoverse-api module from here. Also download the data (egovehicle poses will be necessary).

Next, download the detections zip file and unzip it.

To run the tracker, pass the path to the unzipped detections directory, which should end in argoverse_detections_2020, to run_ab3dmot.py, as shown below:

```sh
DETECTIONS_DATAROOT="/path/to/argoverse_detections_2020" # replace with your own path
POSE_DIR="/path/to/argoverse/data" # should be either val or test set directory
SPLIT="val" # should be either 'val' or 'test'
python run_ab3dmot.py --dets_dataroot $DETECTIONS_DATAROOT --pose_dir $POSE_DIR --split $SPLIT
```
<p align="left"> <img src="videos/de6c96c4-f2b2-3f0f-9971-ed35f4118c1e_ring_front_center_30fps.gif" height="280"> <img src="videos/21e37598-52d4-345c-8ef9-03ae19615d3d_ring_front_center_30fps.gif" height="280"> </p> <p align="center"> <img src="videos/1e5d7745-c7b3-31a0-ae57-c480fcaa220e_ring_front_center_30fps.gif" height="280"> </p>

Brief Explanation of Repo Contents

Citing this work

Open-source Implementation

@misc{lambert2020argoverse_cbgs_kf_tracker,
    author = {John Lambert},
    title = {Open Argoverse CBGS-KF Tracker},
    howpublished={\url{https://github.com/johnwlambert/argoverse_cbgs_kf_tracker}},
    year = {2020},
}

License

This code is provided purely for non-commercial, research purposes. It may not be used commercially in a product without my permission.