
Global-Local Path Networks for Monocular Depth Estimation with Vertical CutDepth [Paper]

Downloads

Google Colab

Open In Colab: https://colab.research.google.com/drive/1v6fzr4XusKdXAaeGZ1gKe1kh9Ce_WQhl?usp=sharing

Thanks to NielsRogge for the great Colab demo.

Requirements

Tested on

python==3.7.7
torch==1.6.0
h5py==3.6.0
scipy==1.7.3
opencv-python==4.5.5
mmcv==1.4.3
timm==0.5.4
albumentations==1.1.0
tensorboardX==2.4.1
gdown==4.2.1

You can install the above packages with

$ pip install -r requirements.txt

Or you can pull the Docker image with

$ docker pull doyeon0113/glpdepth
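
To run the pulled image with GPU access, a minimal command could look like the sketch below; the container mount point /workspace/datasets is an assumption, not something documented here.

$ docker run --gpus all -it -v $(pwd)/datasets:/workspace/datasets doyeon0113/glpdepth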

Inference and Evaluation

Dataset

NYU Depth V2

$ cd ./datasets
$ wget http://horatio.cs.nyu.edu/mit/silberman/nyu_depth_v2/nyu_depth_v2_labeled.mat
$ python ../code/utils/extract_official_train_test_set_from_mat.py nyu_depth_v2_labeled.mat splits.mat ./nyu_depth_v2/official_splits/

KITTI

Download the annotated depth maps dataset (14GB) from [link] into ./datasets/kitti/data_depth_annotated

$ cd ./datasets/kitti/data_depth_annotated/
$ unzip data_depth_annotated.zip

With the above two instructions, you can run eval_with_pngs.py/test.py for NYU Depth V2 and eval_with_pngs.py for KITTI.

To run the full set of experiments, please follow the [BTS] repository to obtain the complete NYU Depth V2 and KITTI datasets.

Your dataset directory should be structured as follows:

root
- nyu_depth_v2
  - bathroom_0001
  - bathroom_0002
  - ...
  - official_splits
- kitti
  - data_depth_annotated
  - raw_data
  - val_selection_cropped
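
As a quick sanity check that your layout matches, you can list the folders created by the steps above:

$ ls ./datasets/nyu_depth_v2/official_splits/
$ ls ./datasets/kitti/data_depth_annotated/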

Evaluation
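
The commands below are a sketch of how eval_with_pngs.py might be invoked: the --dataset, --max_depth_eval, and --garg_crop flags mirror the training commands under Train, while --pred_path, --gt_path, and the ./best_*_preds/ folders are assumptions about the script's interface.

for NYU Depth V2

$ python ./code/eval_with_pngs.py --dataset nyudepthv2 --pred_path ./best_nyu_preds/ --gt_path ./datasets/nyu_depth_v2/ --max_depth_eval 10.0

for KITTI

$ python ./code/eval_with_pngs.py --dataset kitti --pred_path ./best_kitti_preds/ --gt_path ./datasets/kitti/ --max_depth_eval 80.0 --garg_crop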

Inference
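
A minimal test.py invocation might look like this: --dataset, --data_path, --max_depth, and --max_depth_eval are taken from the Train commands below, while --ckpt_dir and the checkpoint path are assumptions about the script's interface.

$ python ./code/test.py --dataset nyudepthv2 --data_path ./datasets/ --ckpt_dir ./ckpt/best_model_nyu.ckpt --max_depth 10.0 --max_depth_eval 10.0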

Train

for NYU Depth V2

$ python ./code/train.py --dataset nyudepthv2 --data_path ./datasets/ --max_depth 10.0 --max_depth_eval 10.0  

for KITTI

$ python ./code/train.py --dataset kitti --data_path ./datasets/ --max_depth 80.0 --max_depth_eval 80.0  --garg_crop
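
Since tensorboardX is listed in the requirements, training presumably writes TensorBoard logs; the log directory below is an assumption:

$ tensorboard --logdir ./logs/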

To-Do

License

For non-commercial purposes only (research, evaluation, etc.).

Citation

@article{kim2022global,
  title={Global-Local Path Networks for Monocular Depth Estimation with Vertical CutDepth},
  author={Kim, Doyeon and Ga, Woonghyun and Ahn, Pyungwhan and Joo, Donggyu and Chun, Sehwan and Kim, Junmo},
  journal={arXiv preprint arXiv:2201.07436},
  year={2022}
}

References

[1] From Big to Small: Multi-Scale Local Planar Guidance for Monocular Depth Estimation. [code]

[2] SegFormer: Simple and Efficient Design for Semantic Segmentation with Transformers. [code]