pytorch-superpoint

This is a PyTorch implementation of "SuperPoint: Self-Supervised Interest Point Detection and Description" by Daniel DeTone, Tomasz Malisiewicz, and Andrew Rabinovich (arXiv 2018). The code is partially based on the TensorFlow implementation at https://github.com/rpautrat/SuperPoint.

Please star this repo if it helps your research. This repo is a by-product of our paper deepFEPE (IROS 2020).

Differences between our implementation and original paper

Results on HPatches

| Task | Homography estimation | | | Detector metric | | Descriptor metric | |
|------|------|------|------|------|------|------|------|
| | Epsilon = 1 | 3 | 5 | Repeatability | MLE | NN mAP | Matching Score |
| Pretrained model | 0.44 | 0.77 | 0.83 | 0.606 | 1.14 | 0.81 | 0.55 |
| SIFT (subpixel accuracy) | 0.63 | 0.76 | 0.79 | 0.51 | 1.16 | 0.70 | 0.27 |
| superpoint_coco_heat2_0_170k_hpatches_sub | 0.46 | 0.75 | 0.81 | 0.63 | 1.07 | 0.78 | 0.42 |
| superpoint_kitti_heat2_0_50k_hpatches_sub | 0.44 | 0.71 | 0.77 | 0.56 | 0.95 | 0.78 | 0.41 |

Installation

Requirements

```
conda create --name py36-sp python=3.6
conda activate py36-sp
pip install -r requirements.txt
pip install -r requirements_torch.txt  # install PyTorch
```

Path setting

Dataset

Datasets should be downloaded into $DATA_DIR. The Synthetic Shapes dataset will also be generated there. The folder structure should look like:

```
datasets/ ($DATA_DIR)
|-- COCO
|   |-- train2014
|   |   |-- file1.jpg
|   |   `-- ...
|   `-- val2014
|       |-- file1.jpg
|       `-- ...
|-- HPatches
|   |-- i_ajuntament
|   `-- ...
|-- synthetic_shapes  # will be automatically created
`-- KITTI  # accumulated folders from raw data
    |-- 2011_09_26_drive_0020_sync
    |   |-- image_00/
    |   `-- ...
    |-- 2011_09_28_drive_0001_sync
    |   |-- image_00/
    |   `-- ...
    |-- 2011_09_29_drive_0004_sync
    |   |-- image_00/
    |   `-- ...
    |-- 2011_09_30_drive_0016_sync
    |   |-- image_00/
    |   `-- ...
    |-- ...
    `-- 2011_10_03_drive_0027_sync
        |-- image_00/
        `-- ...
```

Run the code

Launch TensorBoard to monitor the logs under ./runs/:

```
tensorboard --logdir=./runs/ [--host | static_ip_address] [--port | 6008]
```

1) Training MagicPoint on Synthetic Shapes

```
python train4.py train_base configs/magicpoint_shapes_pair.yaml magicpoint_synth --eval
```

You don't need to download the synthetic data; it is generated the first time you run training and exported to ./datasets. You can change this path in settings.py.
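The relevant entries in settings.py look roughly like the sketch below; the variable names are assumptions (carried over from the rpautrat-style layout), so verify them against your checkout.

```python
# settings.py -- a minimal sketch; variable names are assumptions.
DATA_PATH = 'datasets'  # $DATA_DIR: datasets live here; synthetic_shapes is generated here
EXPER_PATH = 'logs'     # training runs and exported predictions are written here
```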

2) Exporting detections on MS-COCO / KITTI

This step runs homography adaptation (HA) to export pseudo ground truth labels for joint training.
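For intuition, the sketch below shows the core of homography adaptation: detect interest points under many random warps of the same image, warp the heatmaps back, and average them into a pseudo ground truth label. The `detector` callable and the `sample_homography` helper are illustrative assumptions, not this repo's actual API.

```python
# A minimal sketch of homography adaptation (HA); `detector` is assumed to
# map a grayscale image to a same-sized keypoint heatmap (not the repo's API).
import numpy as np
import cv2

def sample_homography(h, w, perturb=0.1):
    # Randomly perturb the four image corners and fit a homography to them.
    src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    dst = (src + np.random.uniform(-perturb, perturb, src.shape) * [w, h]).astype(np.float32)
    return cv2.getPerspectiveTransform(src, dst)

def homography_adaptation(img, detector, num=100):
    h, w = img.shape[:2]
    acc = detector(img).astype(np.float32)   # heatmap on the original image
    count = np.ones_like(acc)
    for _ in range(num - 1):
        H = sample_homography(h, w)
        warped = cv2.warpPerspective(img, H, (w, h))
        heat = detector(warped).astype(np.float32)
        # Warp the detections back to the original frame and accumulate,
        # tracking how often each pixel was visible.
        H_inv = np.linalg.inv(H)
        acc += cv2.warpPerspective(heat, H_inv, (w, h))
        count += cv2.warpPerspective(np.ones_like(heat), H_inv, (w, h))
    return acc / np.maximum(count, 1e-6)     # averaged pseudo ground truth
```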

<!-- - you can export hpatches or coco dataset by editing the 'task' in config file -->
```
export_folder: <'train' | 'val'>  # set export for training or validation
```

General command:

```
python export.py <export task> <config file> <export folder> [--outputImg | output images for visualization (space inefficient)]
```

Export COCO on the training set:

```
python export.py export_detector_homoAdapt configs/magicpoint_coco_export.yaml magicpoint_synth_homoAdapt_coco
```

Export COCO on the validation set (set export_folder to 'val' in the config file):

```
python export.py export_detector_homoAdapt configs/magicpoint_coco_export.yaml magicpoint_synth_homoAdapt_coco
```

Export KITTI:

```
python export.py export_detector_homoAdapt configs/magicpoint_kitti_export.yaml magicpoint_base_homoAdapt_kitti
```
<!--
export tum - config
- check the 'root' in the config file
- set 'datasets/tum_split/train.txt' as the sequences you have
```
python export.py export_detector_homoAdapt configs/magicpoint_tum_export.yaml magicpoint_base_homoAdapt_tum
```
-->

3) Training SuperPoint on MS-COCO / KITTI

You need pseudo ground truth labels to train the detector. Labels can be exported from step 2) or downloaded from the link. Then, as usual, set up the config file before training.
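To sanity-check the labels before training, you can peek inside an exported prediction file. This is only a sketch: the exact file name and the array keys inside the .npz are assumptions, so list f.files on your own export.

```python
# Inspect an exported pseudo label file (file name is hypothetical).
import numpy as np

f = np.load('logs/magicpoint_synth_homoAdapt_coco/predictions/train/file1.npz')
print(f.files)       # names of the stored arrays
pts = f[f.files[0]]  # e.g. the detected interest points for this image
print(pts.shape)
```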

General command:

```
python train4.py <train task> <config file> <export folder> --eval
```

COCO

```
python train4.py train_joint configs/superpoint_coco_train_heatmap.yaml superpoint_coco --eval --debug
```

KITTI

```
python train4.py train_joint configs/superpoint_kitti_train_heatmap.yaml superpoint_kitti --eval --debug
```

4) Export / evaluate the metrics on HPatches

Export

```
python export.py export_descriptor configs/magicpoint_repeatability_heatmap.yaml superpoint_hpatches_test
```

Evaluate

```
python evaluation.py <path to npz files> [-r, --repeatibility | -o, --outputImg | -homo, --homography]
```

For example:

```
python evaluation.py logs/superpoint_hpatches_test/predictions --repeatibility --outputImg --homography --plotMatching
```
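For reference, repeatability measures the fraction of detections that survive the viewpoint change. Below is a rough sketch, with keypoints as (N, 2) arrays of (x, y) pixels and H the ground-truth homography; the repo's exact thresholds and border handling may differ.

```python
# A rough sketch of the repeatability metric; not the repo's implementation.
import numpy as np

def warp_points(pts, H):
    # Apply a 3x3 homography to (N, 2) pixel coordinates.
    pts_h = np.concatenate([pts, np.ones((len(pts), 1))], axis=1)
    warped = (H @ pts_h.T).T
    return warped[:, :2] / warped[:, 2:3]

def repeatability(kpts1, kpts2, H, dist_thresh=3):
    # Count keypoints from each image that land within dist_thresh pixels
    # of a keypoint in the other image, after warping by H.
    warped1 = warp_points(kpts1, H)
    d = np.linalg.norm(warped1[:, None, :] - kpts2[None, :, :], axis=2)
    count1 = (d.min(axis=1) <= dist_thresh).sum()
    count2 = (d.min(axis=0) <= dist_thresh).sum()
    return (count1 + count2) / max(len(kpts1) + len(kpts2), 1)
```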

5) Export / evaluate repeatability with SIFT

```
# export detection, description, matching
python export_classical.py export_descriptor configs/classical_descriptors.yaml sift_test --correspondence

# evaluate (use 'sift' flag)
python evaluation.py logs/sift_test/predictions --sift --repeatibility --homography
```
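For a quick look at the classical baseline outside the evaluation pipeline, OpenCV's SIFT can be run directly. A small sketch: cv2.SIFT_create requires a recent opencv-python (or opencv-contrib-python) build, and the image path is just an example from the dataset tree above.

```python
# Quick SIFT baseline sketch with OpenCV; the path is only an example.
import cv2

img = cv2.imread('datasets/HPatches/i_ajuntament/1.ppm', cv2.IMREAD_GRAYSCALE)
sift = cv2.SIFT_create()
kpts, desc = sift.detectAndCompute(img, None)
print(len(kpts), desc.shape)  # N keypoints, (N, 128) float32 descriptors
```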

Pretrained models

Current best model

Model from MagicLeap:

```
pretrained/superpoint_v1.pth
```
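A minimal sketch of loading these weights, assuming a SuperPointNet class that matches the MagicLeap checkpoint; the import path below is a guess, so check the repo's models/ folder for the actual class.

```python
# Load the MagicLeap checkpoint -- the module path is an assumption.
import torch
from models.SuperPointNet import SuperPointNet  # hypothetical import path

net = SuperPointNet()
net.load_state_dict(torch.load('pretrained/superpoint_v1.pth', map_location='cpu'))
net.eval()  # inference mode
```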

Jupyter notebook

```
# show images saved in the folders
jupyter notebook notebooks/visualize_hpatches.ipynb
```

Updates (year.month.day)

Known problems

Work in progress

Citations

Please cite the original paper.

```
@inproceedings{detone2018superpoint,
  title={Superpoint: Self-supervised interest point detection and description},
  author={DeTone, Daniel and Malisiewicz, Tomasz and Rabinovich, Andrew},
  booktitle={Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition Workshops},
  pages={224--236},
  year={2018}
}
```

Please also cite our DeepFEPE paper.

```
@misc{2020_jau_zhu_deepFEPE,
  title={Deep Keypoint-Based Camera Pose Estimation with Geometric Constraints},
  author={Jau, You-Yi and Zhu, Rui and Su, Hao and Chandraker, Manmohan},
  year={2020},
  eprint={arXiv:2007.15122},
}
```

Credits

This implementation was developed by You-Yi Jau and Rui Zhu. Please contact You-Yi with any problems. Again, the work is based on the TensorFlow implementation by Rémi Pautrat and Paul-Edouard Sarlin, and on the official SuperPointPretrainedNetwork. Thanks to Daniel DeTone for his help during the implementation.

Posts

What have I learned from the implementation of deep learning paper?