<h1 align="center"> <a href="https://openaccess.thecvf.com/content/WACV2023/html/Conti_Sparsity_Agnostic_Depth_Completion_WACV_2023_paper.html">Sparsity Agnostic Depth Completion</a> </h1> <p> <div align="center"> <a href="https://andreaconti.github.io">Andrea Conti</a> &middot; <a href="https://mattpoggi.github.io">Matteo Poggi</a> &middot; <a href="http://vision.deis.unibo.it/~smatt/Site/Home.html">Stefano Mattoccia</a> </div> <div align="center"> <a href="https://arxiv.org/pdf/2212.00790.pdf">[Arxiv]</a> <a href="https://andreaconti.github.io/projects/sparsity_agnostic_depth_completion/">[Project Page]</a> </div> </p>

This repository provides the evaluation code for our WACV 2023 paper.

We present a novel depth completion approach that is agnostic to the sparsity of the input depth points, which is very likely to vary in many practical applications. State-of-the-art approaches yield accurate results only when processing a specific density and distribution of input points, i.e. the one observed during training, limiting their deployment in real use cases. In contrast, our solution is robust to uneven distributions and to extremely low densities never witnessed during training. Experimental results on standard indoor and outdoor benchmarks highlight the robustness of our framework: it achieves accuracy comparable to state-of-the-art methods when tested with a density and distribution matching the training ones, while being much more accurate in all other cases.
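To make "density and distribution of input points" concrete, below is a minimal sketch of one common way sparse depth hints are generated for benchmarking: sampling a fixed number of valid pixels uniformly at random from a dense depth map, as in the usual NYU Depth V2 protocol. It is an illustrative example, not the exact sampling code used in our experiments.

```python
# Illustrative sketch: derive sparse depth "hints" from a dense depth map
# by sampling a fixed number of valid pixels uniformly at random (as in the
# common NYU Depth V2 protocol). Not the exact sampling used in the paper.
import numpy as np

def sample_hints(depth: np.ndarray, num_hints: int = 500, seed: int = 0) -> np.ndarray:
    """Return a sparse depth map with `num_hints` valid pixels, zero elsewhere."""
    rng = np.random.default_rng(seed)
    ys, xs = np.nonzero(depth > 0)  # coordinates of valid depth pixels
    idx = rng.choice(len(ys), size=min(num_hints, len(ys)), replace=False)
    hints = np.zeros_like(depth)
    hints[ys[idx], xs[idx]] = depth[ys[idx], xs[idx]]
    return hints
```

Varying `num_hints` changes the density, while sampling patterns other than uniform (e.g. LiDAR-like scanlines on KITTI) change the distribution.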

## Citation

```bibtex
@InProceedings{Conti_2023_WACV,
    author    = {Conti, Andrea and Poggi, Matteo and Mattoccia, Stefano},
    title     = {Sparsity Agnostic Depth Completion},
    booktitle = {Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision (WACV)},
    month     = {January},
    year      = {2023},
    pages     = {5871-5880}
}
```

## Qualitative Results

To better visualize the performance of our approach, we provide a simple Streamlit application, which can be run as follows:

```bash
$ git clone https://github.com/andreaconti/sparsity-agnostic-depth-completion
$ cd sparsity-agnostic-depth-completion
$ mamba env create -f environment.yml
$ mamba activate sparsity-agnostic-depth-completion
$ streamlit run visualize.py
```

Changing the dataset or the hints density may take a while to display, since the application has to download and unpack the data.

## Quantitative Results

We provide precomputed depth maps for KITTI Depth Completion and NYU Depth V2, with different sparsity patterns.
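As a reference for consuming these files, here is a minimal sketch of loading one such depth map, assuming the standard KITTI Depth Completion encoding (16-bit PNG storing depth in meters scaled by 256, with 0 marking missing pixels); the path below is purely illustrative and does not reflect the actual archive layout.

```python
# Minimal sketch: load a precomputed depth map, assuming the standard KITTI
# Depth Completion encoding (16-bit PNG, meters * 256, 0 = missing pixel).
# The path is illustrative, not the actual layout of the archives.
import numpy as np
from PIL import Image

def load_depth_png(path: str) -> np.ndarray:
    """Return a float32 depth map in meters; missing pixels become NaN."""
    depth = np.asarray(Image.open(path), dtype=np.float32) / 256.0
    depth[depth == 0.0] = np.nan  # 0 encodes "no depth" in KITTI PNGs
    return depth

pred = load_depth_png("precomputed/kitti/lines64/0000000005.png")
```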

Moreover, we provide a simple evaluation script to compute metrics:

```bash
$ git clone https://github.com/andreaconti/sparsity-agnostic-depth-completion
$ cd sparsity-agnostic-depth-completion
$ mamba env create -f environment.yml
$ mamba activate sparsity-agnostic-depth-completion
$ python evaluate.py <kitti-official | nyu-depth-v2-ma-downsampled> <hints density>
```

For instance:

```bash
# KITTI evaluation
$ python evaluate.py kitti-official lines64
# NYU Depth V2 evaluation
$ python evaluate.py nyu-depth-v2-ma-downsampled 500
```
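For reference, depth completion is usually scored with metrics such as RMSE and MAE computed over pixels with valid ground truth. The sketch below shows how these can be computed; it is illustrative and not necessarily identical to the implementation in `evaluate.py`.

```python
# Illustrative sketch of standard depth completion metrics (RMSE and MAE),
# computed only on pixels with valid ground truth. Not necessarily identical
# to the implementation in evaluate.py.
import numpy as np

def rmse_mae(pred: np.ndarray, gt: np.ndarray) -> tuple[float, float]:
    """pred, gt: depth maps in meters; gt <= 0 marks invalid pixels."""
    valid = gt > 0
    err = pred[valid] - gt[valid]
    return float(np.sqrt(np.mean(err ** 2))), float(np.mean(np.abs(err)))
```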