BundleTrack

This is the official implementation of our paper:

BundleTrack: 6D Pose Tracking for Novel Objects without Instance or Category-Level 3D Models

accepted at the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) 2021.

Abstract

Most prior 6D object pose tracking methods assume that the target object's CAD model, at least at the category level, is available for offline training or for online template matching. This work proposes BundleTrack, a general framework for 6D pose tracking of novel objects that does not depend on 3D models at either the instance or the category level. It leverages the complementary attributes of recent advances in deep learning for segmentation and robust feature extraction, together with memory-augmented pose graph optimization for spatiotemporal consistency. This enables long-term, low-drift tracking under various challenging scenarios, including significant occlusions and object motions. Comprehensive experiments on two public benchmarks demonstrate that the proposed approach significantly outperforms state-of-the-art category-level 6D tracking and dynamic SLAM methods. When compared against state-of-the-art methods that rely on an object instance CAD model, comparable performance is achieved despite the proposed method's reduced information requirements. An efficient CUDA implementation provides real-time performance of 10 Hz for the entire framework.

<p float="left"> <img src="./media/vis_scene_1_method_ours_c.gif" width="300" /> <img src="./media/vis_video_bleach0_method_ours_c.gif" width="300" /> </p>

This repo can be readily applied to 6D pose tracking of novel, unknown objects. For CAD model-based 6D pose tracking, please check out our other repository, se(3)-TrackNet.

Bibtex

@inproceedings{wen2021bundletrack,
  title={BundleTrack: 6D Pose Tracking for Novel Objects without Instance or Category-Level 3D Models},
  author={Wen, Bowen and Bekris, Kostas E.},
  booktitle={IEEE/RSJ International Conference on Intelligent Robots and Systems},
  year={2021}
}

Supplementary Video

Click to watch

<img src="./media/supplementary_frontpage.jpg" width="400">

IROS 2021 Presentation

Click to watch

<img src="./media/presentation_firstpage.jpg" width="400">

Results

<img src="./media/nocs_results.png" width="500"> <img src="./media/ycbineoat_results.png" width="2000">

Benchmark Output Results

For convenience of benchmarking and making plots, the pose output results can be downloaded below.
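
If it helps with plotting or custom evaluation, the snippet below sketches one way to load such pose outputs. It assumes each frame's pose is stored as a plain-text file containing a 4x4 homogeneous transform; the directory layout and file naming (`results_dir`, `*.txt`) are illustrative assumptions rather than the repository's guaranteed format.

```python
# Minimal sketch for loading per-frame pose outputs, assuming each frame's
# pose is a 4x4 homogeneous transform stored as a plain-text file. The
# directory layout and file naming below are illustrative assumptions.
import glob
import os

import numpy as np


def load_pose_sequence(results_dir):
    """Load all pose files in results_dir, sorted by frame order."""
    pose_files = sorted(glob.glob(os.path.join(results_dir, "*.txt")))
    poses = [np.loadtxt(f).reshape(4, 4) for f in pose_files]
    return np.stack(poses, axis=0)  # shape: (num_frames, 4, 4)


def translation_drift(poses):
    """Per-frame translation magnitude relative to the first frame."""
    rel = np.linalg.inv(poses[0])[None] @ poses  # express each pose w.r.t. frame 0
    return np.linalg.norm(rel[:, :3, 3], axis=1)


if __name__ == "__main__":
    poses = load_pose_sequence("results/bleach0")  # hypothetical path
    print(poses.shape, translation_drift(poses)[:5])
```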

Setup

For the environment setup, we strongly recommend using our provided Docker environment (setting up from scratch is very complicated and not supported in this repo). You do not need to know how Docker works; only a few basic commands are needed, and they are provided in the steps below.

Data

Depending on what you want to run, download only the data that are necessary.

Run predictions on NOCS

Run predictions on YCBInEOAT

Run predictions on your own RGBD data
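
When preparing your own RGBD sequence, a quick consistency check on the frames and intrinsics can save debugging time. The sketch below assumes a hypothetical layout with `rgb/` and `depth/` folders plus a `cam_K.txt` intrinsics file, and 16-bit depth in millimeters; adjust the paths and conventions to match the format this repo actually expects for your data.

```python
# Sanity-check sketch for a custom RGBD sequence. The folder layout
# (rgb/, depth/, cam_K.txt) and 16-bit millimeter depth are assumptions
# made for illustration; adapt them to your actual data format.
import glob
import os

import cv2
import numpy as np


def check_sequence(seq_dir, depth_scale=1000.0):
    """Verify that RGB frames, depth maps, and intrinsics load consistently."""
    K = np.loadtxt(os.path.join(seq_dir, "cam_K.txt")).reshape(3, 3)
    rgb_files = sorted(glob.glob(os.path.join(seq_dir, "rgb", "*.png")))
    depth_files = sorted(glob.glob(os.path.join(seq_dir, "depth", "*.png")))
    assert len(rgb_files) == len(depth_files), "RGB/depth frame counts differ"

    for rgb_f, depth_f in zip(rgb_files, depth_files):
        rgb = cv2.imread(rgb_f)                            # HxWx3, BGR
        depth = cv2.imread(depth_f, cv2.IMREAD_UNCHANGED)  # HxW, uint16 (mm assumed)
        assert rgb is not None and depth is not None, f"Failed to read {rgb_f} or {depth_f}"
        assert rgb.shape[:2] == depth.shape[:2], "RGB and depth resolutions differ"
        depth_m = depth.astype(np.float32) / depth_scale   # convert to meters
        if not np.isfinite(depth_m).all():
            print(f"Warning: non-finite depth values in {depth_f}")

    print("Camera intrinsics:\n", K)
    print(f"{len(rgb_files)} frames look consistent.")


if __name__ == "__main__":
    check_sequence("/path/to/your_sequence")  # hypothetical path
```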