CKR-nav
Code for our CVPR 2021 paper "Room-and-Object Aware Knowledge Reasoning for Remote Embodied Referring Expression".
Contributed by Chen Gao*, Jinyu Chen*, Si Liu†, Luting Wang, Qiong Zhang, Qi Wu
Getting Started
Installation
- Clone this repository (replace $CKR_ROOT with the path where you want the code):
git clone https://github.com/alloldman/CKR.git $CKR_ROOT
- Install PyTorch 1.3.0:
conda install pytorch=1.3.0 cudatoolkit=9.0 torchvision -c pytorch
- Install the remaining requirements:
pip install -r requirements.txt
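Optionally, the environment can be sanity-checked before moving on; this quick check is not part of the original instructions:
python -c "import torch; print(torch.__version__, torch.cuda.is_available())"   # should print 1.3.0 and True when a CUDA device is visible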
Training and Test
Dataset Preparation
- Download ResNet-152 features for the Matterport3D dataset:
wget https://www.dropbox.com/s/o57kxh2mn5rkx4o/ResNet-152-imagenet.zip -P img_features/
unzip img_features/ResNet-152-imagenet.zip -d img_features/
- Download the intermediate data from here.
data.zip, cache.zip, img_features.zip, and best-ckpt.zip should be unzipped, and the KB data should be downloaded and unzipped under the KB folder.
- Arrange the unzipped files in the layout below (a sketch of possible unzip commands follows the tree):
CKR
├── data
├── KB
│   ├── cache
│   └── data
├── experiments
│   └── best-ckpt
└── img_features
    └── ResNet-152-imagenet.tsv
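The original instructions do not spell out an archive-to-folder mapping, so the commands below are only a sketch of one way to produce the layout above; the archive names (in particular the KB data archive) are assumptions and should be replaced with whatever you actually downloaded:
# run from the CKR repository root; archive names below are assumed, not given in the original README
unzip data.zip -d .
unzip cache.zip -d KB/
unzip <KB-data-archive>.zip -d KB/
unzip best-ckpt.zip -d experiments/
unzip img_features.zip -d .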
Training
- Execute the command below; the trailing 0 selects GPU 0:
bash run.sh train 0
Test
- Evaluate with our rewritten script and select the best checkpoint. An example evaluation on the REVERIE dataset follows; you can change the path to evaluate your own checkpoint:
bash run.sh search experiments/best-ckpt/follower_pm_sample2step_imagenet_mean_pooled_1heads_train_iter_9300val_seen_sr_0.547_val_unseen_sr_0.138_ 0
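To evaluate a checkpoint from your own training run, point the same script at that checkpoint prefix; the path below is an illustrative placeholder, not a file shipped with this repository:
bash run.sh search experiments/<your-experiment>/<checkpoint-prefix> 0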
Citation
Please consider citing this project in your publications if it helps your research. The following is a BibTeX reference:
@inproceedings{gao2021room,
title={Room-and-Object Aware Knowledge Reasoning for Remote Embodied Referring Expression},
author={Gao, Chen and Chen, Jinyu and Liu, Si and Wang, Luting and Zhang, Qiong and Wu, Qi},
booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
year={2021}
}
License
CKR-nav is released under the MIT license. See LICENSE for additional details.
Acknowledgements
Part of the code is built upon REVERIE and babywalk. Thanks to the authors for their great work!