<img src="docs/trailab.png" align="right" width="20%">

CaDDN

CaDDN is a monocular 3D object detection method. This repository is based on [OpenPCDet].

Categorical Depth Distribution Network for Monocular 3D Object Detection
Cody Reading, Ali Harakeh, Julia Chae, and Steven L. Waslander
[Paper]

Overview

Changelog

[2021-03-16] CaDDN v0.3.0 is released.

Introduction

What does CaDDN do?

CaDDN is a general PyTorch-based method for 3D object detection from monocular images. At the time of submission, CaDDN achieved 1st place among published monocular methods on the KITTI 3D object detection benchmark. We welcome contributions to this project.

CaDDN design pattern

We inherit the design pattern from [OpenPCDet].

<p align="center"> <img src="docs/dataset_vs_model.png" width="95%" height="320"> </p>
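To illustrate this separation, here is a minimal sketch, assuming the OpenPCDet-style APIs (`cfg_from_yaml_file`, `build_dataloader`, `build_network`) are available in this repository, that the KITTI data has been prepared as described in GETTING_STARTED.md, and that the config path shown is where the CaDDN config lives; paths and batch sizes are illustrative only.

```python
# Minimal sketch of the OpenPCDet-style dataset/model separation (assumed APIs and paths).
from pcdet.config import cfg, cfg_from_yaml_file
from pcdet.datasets import build_dataloader
from pcdet.models import build_network
from pcdet.utils import common_utils

logger = common_utils.create_logger()

# A single YAML file configures the dataset (DATA_CONFIG) and the model (MODEL) independently.
cfg_from_yaml_file('tools/cfgs/kitti_models/CaDDN.yaml', cfg)  # assumed config path

# The dataset/dataloader is built purely from the DATA_CONFIG block.
train_set, train_loader, _ = build_dataloader(
    dataset_cfg=cfg.DATA_CONFIG,
    class_names=cfg.CLASS_NAMES,
    batch_size=2,
    dist=False,
    workers=4,
    logger=logger,
    training=True,
)

# The model is built from the MODEL block and interacts with the dataset only through its interface.
model = build_network(model_cfg=cfg.MODEL, num_class=len(cfg.CLASS_NAMES), dataset=train_set)
print(model)
```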

Model Zoo

KITTI 3D Object Detection Baselines

Selected supported methods are shown in the table below. The results are the 3D detection performance (AP@R40) of the Car class on the val set of the KITTI dataset.

|       | training time | Easy@R40 | Moderate@R40 | Hard@R40 | download |
|-------|---------------|----------|--------------|----------|----------|
| CaDDN | ~76 hours     | 23.77    | 16.07        | 13.61    | model-774M |
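As an illustration of how the pretrained model above might be used, here is a hedged inference sketch. It assumes the checkpoint has been downloaded to a local file (named `caddn.pth` here purely for illustration), that the KITTI val split has been prepared as in GETTING_STARTED.md, and that the OpenPCDet-style loading and inference utilities shown below are available in this repository.

```python
# Hypothetical inference sketch using the downloaded checkpoint (assumed file name and paths).
import torch

from pcdet.config import cfg, cfg_from_yaml_file
from pcdet.datasets import build_dataloader
from pcdet.models import build_network, load_data_to_gpu
from pcdet.utils import common_utils

logger = common_utils.create_logger()
cfg_from_yaml_file('tools/cfgs/kitti_models/CaDDN.yaml', cfg)  # assumed config path

# Build the KITTI val dataloader and the CaDDN network.
val_set, val_loader, _ = build_dataloader(
    dataset_cfg=cfg.DATA_CONFIG, class_names=cfg.CLASS_NAMES,
    batch_size=1, dist=False, workers=2, logger=logger, training=False,
)
model = build_network(model_cfg=cfg.MODEL, num_class=len(cfg.CLASS_NAMES), dataset=val_set)
model.load_params_from_file(filename='caddn.pth', logger=logger, to_cpu=True)
model.cuda().eval()

# Run one batch through the model; in eval mode it returns per-frame prediction dicts.
with torch.no_grad():
    batch_dict = next(iter(val_loader))
    load_data_to_gpu(batch_dict)
    pred_dicts, _ = model(batch_dict)
    print(pred_dicts[0]['pred_boxes'].shape, pred_dicts[0]['pred_scores'].shape)
```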

Installation

Please refer to INSTALL.md for installation instructions for CaDDN.

Getting Started

Please refer to GETTING_STARTED.md to learn more about how to use this project.

License

CaDDN is released under the Apache 2.0 license.

Acknowledgement

CaDDN is an open-source project for monocular 3D scene perception. We would like to thank the authors of OpenPCDet for open-sourcing their 3D object detection codebase.

Citation

If you find this project useful in your research, please consider citing:

@article{CaDDN,
    title={Categorical Depth Distribution Network for Monocular 3D Object Detection},
    author={Cody Reading and
            Ali Harakeh and
            Julia Chae and
            Steven L. Waslander},
    journal = {CVPR},
    year={2021}
}

Contribution

You are welcome to join the CaDDN development team by contributing to this repository, and feel free to contact us about any potential contributions.