# EdgeDepth
This is the reference PyTorch implementation for training and testing depth estimation models using the method described in

> **The Edge of Depth: Explicit Constraints between Segmentation and Depth**
## ⚙️ Setup
- **Compile the morphing operation:** Our training and evaluation code uses a customized morphing operation. You can still train and evaluate without it, at some cost in performance. To enable it:
  - Make sure your system's CUDA version matches the CUDA version your PyTorch build was compiled with (see the sketch after this list).
  - Type:

    ```shell
    cd bnmorph
    python setup.py install
    cd ..
    ```
  You should be able to compile it successfully if you can compile the CUDA examples in this PyTorch tutorial.
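A quick way to check the version match and that the extension built is the minimal sketch below (the `bnmorph` module name is assumed from the setup directory above; adjust if the compiled module is named differently):

```python
import torch

# CUDA version PyTorch was built against; compare with the output of `nvcc --version`.
print("PyTorch CUDA version:", torch.version.cuda)
print("CUDA available:", torch.cuda.is_available())

# Module name assumed from the `bnmorph` setup directory.
try:
    import bnmorph
    print("bnmorph extension imported successfully")
except ImportError as err:
    print("bnmorph not available:", err)
```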
- **Prepare KITTI data:** We use the KITTI raw dataset as well as predicted semantic labels from this paper.
  - To download the KITTI raw data:

    ```shell
    wget -i splits/kitti_archives_to_download.txt -P kitti_data/
    ```
  - Use this link to download the precomputed semantic labels.
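Once the archives finish downloading, they still need to be unpacked. A minimal sketch, assuming everything was fetched into `kitti_data/` as in the `wget` command above:

```python
import glob
import zipfile

# Extract every downloaded KITTI archive in place.
for archive in glob.glob("kitti_data/*.zip"):
    with zipfile.ZipFile(archive) as zf:
        zf.extractall("kitti_data")
    print("extracted", archive)
```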
## ⏳ Training
Training code will be released soon.
## 📊 Evaluation
- A pretrained model is available here.
- Precompute the ground-truth depth maps:

  ```shell
  python export_gt_depth.py --data_path [Your Kitti Raw Data Address] --split eigen
  ```
- To evaluate without morphing, run:

  ```shell
  python evaluate_depth.py --split eigen --dataset kitti \
    --data_path [Your Kitti Raw Data Address] --load_weights_folder [Your Model Address] \
    --eval_stereo --num_layers 50 --post_process
  ```
- To evaluate with morphing, run:

  ```shell
  python evaluate_depth.py --split eigen --dataset kitti \
    --data_path [Your Kitti Raw Data Address] --load_weights_folder [Your Model Address] \
    --eval_stereo --num_layers 50 --post_process \
    --bnMorphLoss --load_semantics --seman_path [Your Predicted Semantic Label Address]
  ```
- You should get performance similar to the "Ours" entries in the table below:

| Method | Uses LiDAR Ground Truth? | Morphed? | KITTI abs. rel. error | δ < 1.25 |
| --- | --- | --- | --- | --- |
| BTS | Yes | No | 0.091 | 0.904 |
| Depth Hints | No | No | 0.096 | 0.890 |
| Ours | No | No | 0.091 | 0.898 |
| Ours | No | Yes | 0.090 | 0.899 |
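For reference, the two metrics in the table are the standard ones in the KITTI depth literature. A minimal sketch of how they are computed, where `gt` and `pred` are aligned depth arrays with invalid pixels already masked out:

```python
import numpy as np

def depth_metrics(gt, pred):
    """Standard KITTI-style depth metrics on flattened, valid-pixel arrays."""
    abs_rel = np.mean(np.abs(gt - pred) / gt)   # absolute relative error
    thresh = np.maximum(gt / pred, pred / gt)   # per-pixel accuracy ratio
    a1 = np.mean(thresh < 1.25)                 # fraction of pixels within 1.25x of GT
    return abs_rel, a1

# Example with dummy data:
gt = np.array([10.0, 20.0, 30.0])
pred = np.array([9.0, 22.0, 29.0])
print(depth_metrics(gt, pred))
```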
## 🖼 Running on your own images
To run on your own images, use:

```shell
python test_simple.py --image_path <your_image_path> \
  --model_path <your_model_path> \
  --num_layers <18 or 50>
```
This will save the predicted depth as a NumPy array (at the original resolution), along with colormapped depth and disparity images.
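To work with the saved array afterwards, here is a minimal sketch; the output filename is a hypothetical placeholder, so check what `test_simple.py` actually writes next to your input image:

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical output path; adjust to the file test_simple.py produces.
depth = np.load("assets/test_image_depth.npy")

print("depth map shape:", depth.shape)
plt.imshow(depth.squeeze(), cmap="magma")  # squeeze in case of a leading channel dim
plt.colorbar(label="depth")
plt.savefig("depth_vis.png")
```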
## Acknowledgment

Much of our codebase comes from Monodepth2 and Depth Hints.