<div style="text-align: center;">

UMERegRobust - Universal Manifold Embedding Compatible Features for Robust Point Cloud Registration (ECCV 2024)


</div> <hr> <p align="center"> <img src="assets/pc_example.png" alt="Image 1" width="45%"/> <img src="assets/teaser.png" alt="Image 2" width="45%"/> </p>

In this work, we adopt the Universal Manifold Embedding (UME) framework for the estimation of rigid transformations and extend it so that it can accommodate scenarios involving partial overlap and differently sampled point clouds. UME is a methodology designed for mapping observations of the same object, related by rigid transformations, into a single low-dimensional linear subspace. This process yields a transformation-invariant representation of the observations, while its matrix form representation is covariant (i.e., equivariant) with the transformation. We extend the UME framework by introducing a UME-compatible feature extraction method augmented with a unique UME contrastive loss and a sampling equalizer. These components are integrated into a comprehensive and robust registration pipeline, named UMERegRobust. We propose the RotKITTI registration benchmark, specifically tailored to evaluate registration methods in scenarios involving large rotations. UMERegRobust achieves better than state-of-the-art performance on the KITTI benchmark, especially when the strict precision of $(1^\circ, 10cm)$ is considered (with an average gain of +9%), and notably outperforms SOTA methods on the RotKITTI benchmark (with a +45% gain compared to the most recent SOTA method).

arXiv Link: https://www.arxiv.org/abs/2408.12380 <br> Paper Link: ECCV 2024 Springer Version <br>

<hr>

Method Overview


<hr>

Environment Setup

Code was tested on:

  - Python 3.8, PyTorch 1.13.0 + CUDA 11.7

Special Packages Used:

  - MinkowskiEngine, PyTorch3D, torch_scatter, NKSR

Create Env:

# Create Conda Env
conda create -n umereg_conda_env python=3.8
conda activate umereg_conda_env

# Install CUDA Toolkit 11.7
conda install nvidia/label/cuda-11.7.0::cuda-toolkit
conda install conda-forge::cudatoolkit-dev

# Git for Conda
conda install git

# Install Pytorch 1.13.0+cu117
pip install torch==1.13.0+cu117 torchvision==0.14.0+cu117 torchaudio==0.13.0 --extra-index-url https://download.pytorch.org/whl/cu117

# Install MinkowskiEngine 
pip install -U git+https://github.com/NVIDIA/MinkowskiEngine -v --no-deps --config-settings="--blas_include_dirs=${CONDA_PREFIX}/include" --config-settings="--blas=openblas"

# Install Pytorch3D + torch_scatter
pip install "git+https://github.com/facebookresearch/pytorch3d.git"
pip install torch-scatter -f https://data.pyg.org/whl/torch-1.13.0+cu117.html

# NKSR
pip install -U nksr -f https://nksr.huangjh.tech/whl/torch-1.13.0+cu117.html

# Other Relevant Packages
pip install open3d
pip install tensorboard
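
After installation, the following minimal sanity check (a sketch we suggest here, not a script shipped with the repository) should confirm that the GPU stack and the special packages import correctly:

```python
# sanity_check.py -- quick check that the environment above was set up correctly
import torch

print("torch:", torch.__version__, "| CUDA available:", torch.cuda.is_available())

import MinkowskiEngine as ME  # sparse convolution backbone
import pytorch3d              # point cloud ops
import torch_scatter          # scatter ops
import nksr                   # neural kernel surface reconstruction
import open3d as o3d          # point cloud I/O and visualization

print("MinkowskiEngine:", ME.__version__)
print("Open3D:", o3d.__version__)
```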

Clone UMERegRobust Repository:

git clone https://github.com/yuvalH9/UMERegRobust.git
<hr>

Datasets

You can evaluate or train UMERegRobust on both the KITTI dataset and the nuScenes dataset.

Please refer to the detailed dataset guidelines:

<hr>

Sampling Equalizer Module (SEM) Preprocessing

To preprocess the input point clouds with the SEM, please use:

python datasets/sem_preprocessing.py --dataset_mode [kitti|nuscenes] --split [train|val] --data_path path_to_input_data --output_path path_to_output
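
For convenience, a small driver script along the lines below could run the preprocessing over both splits (a sketch only; the paths are placeholders to replace with your own):

```python
# run_sem_preprocessing.py -- run SEM preprocessing for both splits (paths are placeholders)
import subprocess

DATA_PATH = "/path/to/kitti"        # replace with your dataset location
OUTPUT_PATH = "/path/to/sem_cache"  # replace with your cache location

for split in ["train", "val"]:
    subprocess.run(
        [
            "python", "datasets/sem_preprocessing.py",
            "--dataset_mode", "kitti",
            "--split", split,
            "--data_path", DATA_PATH,
            "--output_path", OUTPUT_PATH,
        ],
        check=True,  # stop if a split fails
    )
```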

We also supply download links for the SEM-preprocessed data for both the KITTI (test, lokitti, rotkitti) and nuScenes (test, lonuscenes, rotnuscenes) registration benchmarks.

<hr>

RotKITTI & RotNuscenes Registration Benchmarks

We propose two new registration benchmarks, RotKITTI and RotNuscenes. These benchmarks focus on point cloud pairs with large relative rotations encountered in the wild (not synthetic rotations). Each benchmark contains registration problems with relative rotations ranging between 30 and 180 degrees. We encourage the community to test their methods on these benchmarks.

To use the benchmarks, first download the KITTI / nuScenes datasets as described in section Datasets. The registration problems (source-target pairs) are saved in the files rotkitti_metadata.npy and rotnuscenes_metadata.npy, along with their corresponding GT transformations in the files rotkitti_gt_tforms.npy and rotnuscenes_gt_tforms.npy, respectively.
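
A minimal sketch of how these files might be read (the exact array layout is an assumption; we only assume the metadata stores source-target pair identifiers and the GT file stores the matching 4x4 transformations):

```python
# load_rotkitti_benchmark.py -- sketch of reading the RotKITTI benchmark files
# NOTE: the array layouts assumed below are illustrative, not a documented format.
import numpy as np

pairs = np.load("rotkitti_metadata.npy", allow_pickle=True)       # source-target pair identifiers
gt_tforms = np.load("rotkitti_gt_tforms.npy", allow_pickle=True)  # assumed 4x4 GT transformations

print("number of registration problems:", len(pairs))
print("first pair:", pairs[0])
print("first GT transform:\n", gt_tforms[0])
```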

<hr>

Usage

Eval

  1. Download the original data as described in section Datasets to data_path.
  2. Download the SEM preprocessed data as described in section SEM Preprocessing to cache_data_path.
  3. Update paths in relevant benchmark config files.
  4. Evaluate KITTI benchmarks:
    python evaluate.py --benchmark [kitti_test|lokitti|rotkitti]
    
  5. Evaluate nuScenes benchmarks:
    python evaluate.py --benchmark [nuscenes_test|lonuscenes|rotnuscenes]
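
To run all benchmarks in one go, a small wrapper along these lines could work (a sketch only, not a script shipped with the repository):

```python
# run_all_evals.py -- sketch: run evaluate.py over all benchmarks listed above
import subprocess

BENCHMARKS = [
    "kitti_test", "lokitti", "rotkitti",           # KITTI benchmarks
    "nuscenes_test", "lonuscenes", "rotnuscenes",  # nuScenes benchmarks
]

for benchmark in BENCHMARKS:
    print(f"=== Evaluating {benchmark} ===")
    subprocess.run(["python", "evaluate.py", "--benchmark", benchmark], check=True)
```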
    

Train

  1. Download the original data as described in section Datasets to data_path.
  2. Run the SEM preprocessing for the train and val splits as described in section SEM Preprocessing, writing the output data to cache_data_path.
  3. Update paths in relevant train config files.
  4. Train KITTI:
    python train_coloring.py --config kitti
    
  5. Train nuScenes:
    python train_coloring.py --config nuscenes
    
<hr>

Results - KITTI Benchmarks

KITTI Test

| Method       | Normal Precision <br/>(1.5°, 30 cm) | Strict Precision <br/>(1°, 10 cm) |
|--------------|-------------------------------------|-----------------------------------|
| FCGF         | 75.1                                | 73.1                              |
| Predator     | 88.2                                | 58.7                              |
| CoFiNet      | 83.2                                | 56.4                              |
| GeoTrans     | 66.3                                | 62.6                              |
| GCL          | 93.9                                | 78.6                              |
| UMERegRobust | 94.3                                | 87.8                              |

Table 1: KITTI Benchmark - Registration Recall [%]
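
For reference, registration recall under a (rotation, translation) threshold pair is typically computed along the lines sketched below; this is a generic illustration, not necessarily the exact evaluation code of this repository:

```python
# registration_recall.py -- generic sketch of the (rotation, translation) success criterion
import numpy as np

def registration_success(T_est, T_gt, rot_thresh_deg=1.5, trans_thresh_m=0.30):
    """Check whether an estimated 4x4 transform is within the given thresholds of the GT."""
    R_est, t_est = T_est[:3, :3], T_est[:3, 3]
    R_gt, t_gt = T_gt[:3, :3], T_gt[:3, 3]

    # Rotation error: angle of the relative rotation R_gt^T @ R_est
    cos_angle = np.clip((np.trace(R_gt.T @ R_est) - 1.0) / 2.0, -1.0, 1.0)
    rot_err_deg = np.degrees(np.arccos(cos_angle))

    # Translation error: Euclidean distance between translation vectors
    trans_err_m = np.linalg.norm(t_est - t_gt)

    return rot_err_deg <= rot_thresh_deg and trans_err_m <= trans_thresh_m

def registration_recall(est_tforms, gt_tforms, rot_thresh_deg, trans_thresh_m):
    """Fraction of registration problems solved within the thresholds, in percent."""
    successes = [
        registration_success(T_est, T_gt, rot_thresh_deg, trans_thresh_m)
        for T_est, T_gt in zip(est_tforms, gt_tforms)
    ]
    return 100.0 * np.mean(successes)
```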

RotKITTI

| Method       | Normal Precision <br/>(1.5°, 30 cm) | Strict Precision <br/>(1°, 10 cm) |
|--------------|-------------------------------------|-----------------------------------|
| FCGF         | 11.6                                | 3.6                               |
| Predator     | 41.6                                | 35.0                              |
| CoFiNet      | 62.5                                | 30.1                              |
| GeoTrans     | 78.5                                | 50.1                              |
| GCL          | 40.1                                | 28.8                              |
| UMERegRobust | 81.1                                | 73.3                              |

Table 2: RotKITTI Benchmark - Registration Recall [%]

LoKITTI

| Method       | Normal Precision <br/>(1.5°, 30 cm) | Strict Precision <br/>(1°, 10 cm) |
|--------------|-------------------------------------|-----------------------------------|
| FCGF         | 17.2                                | 6.9                               |
| Predator     | 33.7                                | 28.4                              |
| CoFiNet      | 11.2                                | 1.0                               |
| GeoTrans     | 37.8                                | 7.2                               |
| GCL          | 72.3                                | 26.9                              |
| UMERegRobust | 59.3                                | 30.2                              |

Table 3: LoKITTI Benchmark - Registration Recall [%]

<hr>

Results - nuScenes Benchmarks

nuScenes Test

| Method       | Normal Precision <br/>(1.5°, 30 cm) | Strict Precision <br/>(1°, 10 cm) |
|--------------|-------------------------------------|-----------------------------------|
| FCGF         | 58.2                                | 37.8                              |
| Predator     | 53.9                                | 48.1                              |
| CoFiNet      | 62.3                                | 56.1                              |
| GeoTrans     | 70.7                                | 37.9                              |
| GCL          | 82.0                                | 67.5                              |
| UMERegRobust | 85.5                                | 76.0                              |

Table 4: nuScenes Benchmark - Registration Recall [%]

RotNuscenes

| Method       | Normal Precision <br/>(1.5°, 30 cm) | Strict Precision <br/>(1°, 10 cm) |
|--------------|-------------------------------------|-----------------------------------|
| FCGF         | 5.5                                 | 5.2                               |
| Predator     | 16.5                                | 15.7                              |
| CoFiNet      | 27.0                                | 23.6                              |
| GeoTrans     | 34.3                                | 13.1                              |
| GCL          | 21.0                                | 19.6                              |
| UMERegRobust | 51.9                                | 39.7                              |

Table 5: RotNuscenes Benchmark - Registration Recall [%]

LoNuscenes

| Method       | Normal Precision <br/>(1.5°, 30 cm) | Strict Precision <br/>(1°, 10 cm) |
|--------------|-------------------------------------|-----------------------------------|
| FCGF         | 1.9                                 | 0.0                               |
| Predator     | 35.6                                | 4.2                               |
| CoFiNet      | 30.3                                | 23.5                              |
| GeoTrans     | 48.1                                | 17.3                              |
| GCL          | 62.3                                | 5.6                               |
| UMERegRobust | 70.8                                | 56.3                              |

Table 6: LoNuscenes Benchmark - Registration Recall [%]

<hr>

Citation

If you find this work useful, please cite:

@inproceedings{haitman2025umeregrobust,
  title={UMERegRobust-Universal Manifold Embedding Compatible Features for Robust Point Cloud Registration},
  author={Haitman, Yuval and Efraim, Amit and Francos, Joseph M},
  booktitle={European Conference on Computer Vision},
  pages={358--374},
  year={2025},
  organization={Springer}
}
<hr>