EdgeDepth

This is the reference PyTorch implementation for training and testing depth estimation models using the method described in

The Edge of Depth: Explicit Constraints between Segmentation and Depth

Shengjie Zhu, Garrick Brazil and Xiaoming Liu

CVPR 2020

⚙️ Setup

  1. Compile Morphing operation:

    We implement a customized morphing operation used in our evaluation and training code. You can still train and evaluate without it, at some cost in performance. To enable it, do the following:

    1. Make sure your system's CUDA toolkit version matches the CUDA version your PyTorch build uses (a quick version-check sketch is given at the end of this Setup section).

    2. Type:

    cd bnmorph
    python setup.py install
    cd ..
    

    You should be able to compile it successfully if you can compile the CUDA examples in this PyTorch tutorial.

  2. Prepare KITTI data: We use the KITTI raw dataset as well as the predicted semantic labels from this paper.

    1. To download the KITTI raw data:
    wget -i splits/kitti_archives_to_download.txt -P kitti_data/
    
    2. Use this link to download the precomputed semantic labels.
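
For step 1 of the morphing setup, a minimal sketch for checking that your system CUDA toolkit matches the CUDA version PyTorch was built against (assuming nvcc is on your PATH; this helper is not part of the repository) is:

    # Compare the CUDA toolkit seen by nvcc with the CUDA version PyTorch was built with;
    # the two releases should match before compiling the bnmorph extension.
    import subprocess

    import torch

    print("PyTorch built with CUDA:", torch.version.cuda)
    out = subprocess.run(["nvcc", "--version"], capture_output=True, text=True)
    print("System nvcc reports:\n" + out.stdout.strip())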

⏳ Training

Training code will be released soon.

📊 Evaluation

  1. A pretrained model is available here.

  2. Precompute the ground-truth depth maps:

    python export_gt_depth.py --data_path [Your Kitti Raw Data Address] --split eigen
    
  3. To evaluate without morphing, run:

    python evaluate_depth.py --split eigen --dataset kitti --data_path [Your Kitti Raw Data Address] --load_weights_folder [Your Model Address] --eval_stereo \
     --num_layers 50 --post_process
    

    To evaluate with morphing, run:

    python evaluate_depth.py --split eigen --dataset kitti --data_path [Your Kitti Raw Data Address] --load_weights_folder [Your Model Address] --eval_stereo \
     --num_layers 50 --post_process --bnMorphLoss --load_semantics --seman_path [Your Predicted Semantic Label Address]
    
  4. You should get performance similar to the "Ours" entries in the table below (a sketch of how the metrics are computed follows the table):

    | Method      | Uses LiDAR ground truth? | Morphed? | KITTI abs. rel. error | delta < 1.25 |
    |-------------|--------------------------|----------|-----------------------|--------------|
    | BTS         | Yes                      | No       | 0.091                 | 0.904        |
    | Depth Hints | No                       | No       | 0.096                 | 0.890        |
    | Ours        | No                       | No       | 0.091                 | 0.898        |
    | Ours        | No                       | Yes      | 0.090                 | 0.899        |
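
    The last two columns are the standard monocular-depth metrics. A minimal sketch of how they are conventionally computed from matched arrays of valid ground-truth and predicted depths (this is not the repository's evaluation code) is:

    import numpy as np

    def abs_rel_and_a1(gt, pred):
        # gt, pred: 1-D arrays of valid ground-truth and predicted depths (metres)
        abs_rel = np.mean(np.abs(gt - pred) / gt)   # absolute relative error
        ratio = np.maximum(gt / pred, pred / gt)
        a1 = np.mean(ratio < 1.25)                  # fraction of pixels with ratio below 1.25
        return abs_rel, a1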

🖼 Running on your own images

To run the model on your own images, use:

python test_simple.py --image_path <your_image_path> \
  --model_path <your_model_path> \
  --num_layers <18 or 50>

This will save the predicted depth as a numpy array (at the original resolution), along with colormapped depth and disparity images.
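
As a quick check, you can load the saved array and render it yourself. A minimal sketch follows; the output filename here is hypothetical, so substitute whatever file test_simple.py actually writes for your image:

    import numpy as np
    import matplotlib.pyplot as plt

    depth = np.load("your_image_depth.npy")  # hypothetical filename; use the array saved by test_simple.py
    plt.imshow(np.squeeze(depth), cmap="magma")
    plt.colorbar(label="depth")
    plt.savefig("depth_vis.png")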

Acknowledgment

Much of our codebase comes from Monodepth2 and Depth Hints.