# Curious Representation Learning for Embodied Intelligence
This repository contains the PyTorch code for the paper *Curious Representation Learning for Embodied Intelligence*. The codebase is built on Habitat-Lab; please see `HABITAT_README.md` for installation instructions.
## Interactive Pretraining of Embodied Agents
To pretrain agent weights on Matterport3D, use the following command:

```shell
python habitat_baselines/run.py --run-type=train --exp-config habitat_baselines/cvpr_config/pretrain/curiosity_pointnav_pretrain.yaml
```
The other configs used in the paper can be found in `habitat_baselines/cvpr_config/pretrain`.
## Downstream ImageNav Finetuning
To finetune weights on ImageNav, use the following command:

```shell
python habitat_baselines/run.py --run-type=train --exp-config habitat_baselines/cvpr_config/imagenav/curiosity_pointnav_gibson_imagenav.yaml
```
## Downstream ObjectNav Finetuning
To finetune weights on ObjectNav, use the following command:

```shell
python habitat_baselines/run.py --run-type=train --exp-config habitat_baselines/cvpr_config/objectnav/curiosity_pointnav_mp3d_objectnav.yaml
```
## Pretrained Weights
The pretrained CRL model from the Matterport3D environment can be downloaded from here.
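Before finetuning, it can be useful to inspect a downloaded checkpoint and extract only the visual-encoder weights for transfer. The sketch below is illustrative only: the filename `crl_demo.pth` and the key names (`state_dict`, `visual_encoder.*`) are assumptions standing in for whatever layout the released checkpoint actually uses, so adapt the key filter after printing the real keys.

```python
import torch

# Build a dummy checkpoint mimicking a typical habitat_baselines layout,
# with the policy weights stored under a "state_dict" key. The key names
# in the released CRL checkpoint may differ.
ckpt = {
    "state_dict": {
        "visual_encoder.conv1.weight": torch.zeros(32, 3, 3, 3),
        "action_head.weight": torch.zeros(4, 512),
    }
}
torch.save(ckpt, "crl_demo.pth")

# Load on CPU and keep only the visual-encoder weights for transfer.
loaded = torch.load("crl_demo.pth", map_location="cpu")
encoder_weights = {
    k: v for k, v in loaded["state_dict"].items()
    if k.startswith("visual_encoder.")
}
print(sorted(encoder_weights))  # -> ['visual_encoder.conv1.weight']
```

The filtered dict can then be passed to `model.load_state_dict(encoder_weights, strict=False)` on a downstream policy whose encoder shares those parameter names.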
## Citing Our Paper
If you find our code useful for your research, please consider citing the following paper, as well as the papers listed in `HABITAT_README.md`:
```bibtex
@inproceedings{du2021curious,
  author    = {Du, Yilun and Gan, Chuang and Isola, Phillip},
  title     = {Curious Representation Learning for Embodied Intelligence},
  booktitle = {Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition},
  year      = {2021}
}
```