
Full Transformer Framework for Robust Point Cloud Registration with Deep Information Interaction

(Figure: DIT architecture)

This repository contains Python scripts for training and testing the Deep Interaction Transformer (DIT).

Deep Interaction Transformer (DIT) is a full Transformer framework for point cloud registration that achieves superior accuracy and robustness compared with current state-of-the-art learning-based methods. DIT consists of three main modules.
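As a rough, purely illustrative sketch (hypothetical names; not the repository's actual modules or the paper's exact architecture), Transformer-based registration networks generally share one data flow: embed both point clouds, let their features interact through cross-attention, form soft correspondences, and recover the rigid motion with a differentiable SVD (Kabsch) step.

import torch
import torch.nn as nn

def kabsch(src, dst):
    # Closed-form rigid transform (R, t) aligning src to dst (batched Kabsch / Procrustes).
    c_src, c_dst = src.mean(1, keepdim=True), dst.mean(1, keepdim=True)
    H = (src - c_src).transpose(1, 2) @ (dst - c_dst)        # (B, 3, 3) cross-covariance
    U, _, Vt = torch.linalg.svd(H)
    V, Ut = Vt.transpose(1, 2), U.transpose(1, 2)
    d = torch.sign(torch.linalg.det(V @ Ut))                 # guard against reflections
    S = torch.diag_embed(torch.stack([torch.ones_like(d), torch.ones_like(d), d], dim=-1))
    R = V @ S @ Ut
    t = c_dst.transpose(1, 2) - R @ c_src.transpose(1, 2)
    return R, t.squeeze(-1)

class RegistrationSketch(nn.Module):
    # Toy flow: per-point features -> cross-attention interaction -> soft matches -> SVD.
    def __init__(self, d_model=64, nhead=4):
        super().__init__()
        self.embed = nn.Sequential(nn.Linear(3, d_model), nn.ReLU(), nn.Linear(d_model, d_model))
        self.cross_attn = nn.MultiheadAttention(d_model, nhead, batch_first=True)

    def forward(self, src, tgt):
        # src: (B, N, 3), tgt: (B, M, 3)
        f_src, f_tgt = self.embed(src), self.embed(tgt)
        f_src, _ = self.cross_attn(f_src, f_tgt, f_tgt)                # source attends to target
        scores = torch.softmax(f_src @ f_tgt.transpose(1, 2), dim=-1)  # (B, N, M) soft matches
        matched = scores @ tgt                                          # pseudo-target per source point
        return kabsch(src, matched)

# Example: R, t = RegistrationSketch()(torch.rand(2, 128, 3), torch.rand(2, 96, 3))

The real DIT pipeline is considerably more involved; the snippet only illustrates the overall data flow.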

Configuration

This code is implemented in PyTorch and has been tested on:

You can install the Python requirements with:

cd DIT
pip install -r requirements.txt

Training

Run the relevant commands under the /DIT path to train the DIT model in a specific environment; the trained network parameters will be saved in the /models folder. We use the ModelNet40 dataset for this work, which is downloaded automatically if necessary (a sketch of how such training pairs can be synthesized follows the commands below).

Train DIT on the clean, low-noise partial, and high-noise partial point clouds with:

sh experiments/1_train_clean.sh
sh experiments/1_train_low_noise_partial.sh
sh experiments/1_train_high_noise_partial.sh
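For intuition only, the following hedged sketch (hypothetical helper, not the repository's actual data pipeline) shows how a registration pair with noise and partial overlap can be synthesized from one ModelNet40 point cloud:

import numpy as np
from scipy.spatial.transform import Rotation

def make_pair(points, noise_std=0.01, keep_ratio=0.7, seed=0):
    # points: (N, 3) array sampled from one ModelNet40 shape.
    # Returns (source, target, R, t) with target approximately source @ R.T + t.
    rng = np.random.default_rng(seed)

    # Random ground-truth rigid motion.
    R = Rotation.random(random_state=seed).as_matrix()
    t = rng.uniform(-0.5, 0.5, size=3)
    source = points.copy()
    target = source @ R.T + t

    # Gaussian jitter loosely corresponds to the "low noise" / "high noise" settings.
    if noise_std > 0:
        target = target + rng.normal(0.0, noise_std, size=target.shape)

    # Keep only the points in a random half-space to mimic a partial view.
    if keep_ratio < 1.0:
        direction = rng.normal(size=3)
        direction /= np.linalg.norm(direction)
        order = np.argsort(target @ direction)[::-1]
        target = target[order[: int(keep_ratio * len(target))]]

    return source, target, R, t

The noise_std and keep_ratio values above are placeholders; the settings actually used in the experiments are configured inside the training scripts.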

Evaluation

We provide pretrained models for evaluation.

Run the relevant commands under the /DIT path to evaluate the DIT model in a specific environment. To evaluate your own training results, change the model path directly in the corresponding .sh file.

Evaluate DIT on the clean, low-noise partial, and high-noise partial point clouds with:

sh experiments/1_eval_clean.sh
sh experiments/1_eval_low_noise_partial.sh
sh experiments/1_eval_high_noise_partial.sh
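The evaluation scripts compute the metrics themselves. As a reference point only, registration results are usually summarized by the angular error between the estimated and ground-truth rotations and the Euclidean distance between the translations; a minimal sketch (function name hypothetical):

import numpy as np

def registration_errors(R_est, t_est, R_gt, t_gt):
    # Angular rotation error (degrees) from the relative rotation R_est^T @ R_gt,
    # plus the Euclidean translation error.
    cos_angle = (np.trace(R_est.T @ R_gt) - 1.0) / 2.0
    rot_err_deg = np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))
    trans_err = np.linalg.norm(t_est - t_gt)
    return rot_err_deg, trans_err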

To visualize the DIT registration process, run the relevant commands under the /DIT path in a specific environment:

sh experiments/1_eval_clean_vis.sh
sh experiments/1_eval_low_noise_partial_vis.sh
sh experiments/1_eval_high_noise_partial_vis.sh
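The *_vis.sh scripts drive the visualization themselves. For a quick standalone look at a result, a minimal Open3D sketch (the library choice and function name are assumptions, not taken from the repository) could be:

import numpy as np
import open3d as o3d

def show_alignment(source, target, R, t):
    # Render the source (orange), the target (blue) and the source after the
    # estimated transform (green) to inspect a registration result.
    def to_pcd(points, color):
        pcd = o3d.geometry.PointCloud()
        pcd.points = o3d.utility.Vector3dVector(points)
        pcd.paint_uniform_color(color)
        return pcd

    aligned = source @ R.T + t
    o3d.visualization.draw_geometries([
        to_pcd(source, [1.0, 0.6, 0.0]),
        to_pcd(target, [0.0, 0.4, 1.0]),
        to_pcd(aligned, [0.0, 0.8, 0.2]),
    ])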