<h1 align="center">Neural Grasp Distance Fields for Robot Manipulation</h1>

<div align="center">
  <a href="https://thomasweng.com/">Thomas Weng</a> •
  <a href="https://davheld.github.io/">David Held</a> •
  <a href="https://fmeier.github.io/">Franziska Meier</a> •
  <a href="https://www.mustafamukadam.com/">Mustafa Mukadam</a>
</div>

<h4 align="center">
  <a href="https://sites.google.com/view/neural-grasp-distance-fields"><b>Website</b></a> •
  <a href="https://arxiv.org/abs/2211.02647"><b>Paper</b></a>
</h4>

<div align="center">
  <img height="30" src=".github/meta_ai.jpeg" alt="Meta-AI" />
  <img height="40" src=".github/rpad.jpg" alt="rpad" />
</div>

## Setup

- Clone the repository:
  ```
  git clone --recursive git@github.com:facebookresearch/NGDF.git
  ```
- Create a conda environment and install package dependencies. Note: mamba is highly recommended as a drop-in replacement for conda.
  ```
  cd NGDF
  bash install.sh
  ```
- Install PyTorch separately, based on your CUDA driver version. The command below was tested on a 3080/3090 with CUDA 11.1:
  ```
  source prepare.sh
  pip install torch==1.8.1+cu111 torchvision==0.9.1+cu111 -f https://download.pytorch.org/whl/torch_stable.html
  ```
Run `source prepare.sh` before running any `ngdf` training or evaluation code to activate the environment and set environment variables.
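To sanity-check the installation, you can verify that the pinned PyTorch build sees your GPU. This is a minimal check, not part of the repo's scripts:

```python
# Minimal sanity check for the PyTorch install; not part of the NGDF scripts.
import torch

print(torch.__version__)          # expect 1.8.1+cu111
print(torch.cuda.is_available())  # expect True with a working CUDA 11.1 driver
if torch.cuda.is_available():
    print(torch.cuda.get_device_name(0))  # e.g. a 3080/3090
```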
## Folder structure

```
NGDF
├── acronym                     # Submodule with utilities for ACRONYM dataset
├── contact_graspnet            # Submodule with Contact-GraspNet for baselines
├── data                        # Datasets, models, and evaluation output
├── differentiable-robot-model  # Submodule for differentiable FK
├── ndf_robot                   # Submodule for pre-trained shape embedding
├── ngdf                        # Code for training and evaluating NGDF networks
├── OMG-Planner                 # Submodule with pybullet env, reach and grasp evaluation
├── scripts                     # Scripts for running training and evaluation
└── theseus                     # Submodule for differentiable FK and SE(3) ops
```
## Grasp Level Set Optimization Evaluation

- Download datasets `acronym_perobj` and `acronym_multobj` from this Google Drive link and place them in `data/`. The datasets are required to compute the closest grasp metric (see the sketches after this list) and are also used in training.
- Run evaluation
  - Download pre-trained models and configs into `data/models` from this link
  - Download object rotations into `data` from this link
  - Run grasp level set evaluations:
    ```
    bash scripts/eval/grasp_level_set/perobj.sh
    bash scripts/eval/grasp_level_set/multobj.sh
    ```
    Results are stored in `eval/` in each model dir. To evaluate the grasps in pybullet, first install the dependencies in the following section, then run the above commands with the `-p` flag:
    ```
    bash scripts/eval/grasp_level_set/perobj.sh -p
    ```
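For intuition, grasp level set optimization refines a gripper pose by descending the gradient of the network's predicted distance to the closest grasp. Below is a minimal sketch, assuming a trained NGDF `model` that maps a query pose to a scalar distance; the name `model`, the 7-D position-plus-quaternion parameterization, and the optimizer settings are illustrative, and the actual procedure lives in `scripts/eval/grasp_level_set/`:

```python
# Illustrative sketch of grasp level set optimization: gradient descent on the
# predicted distance-to-closest-grasp with respect to the query pose. `model`
# and the pose parameterization are assumptions; see the repo's eval scripts
# for the real implementation.
import torch

def optimize_grasp(model, init_pose, steps=100, lr=1e-2):
    pose = init_pose.clone().requires_grad_(True)  # (7,) position + quaternion
    opt = torch.optim.Adam([pose], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        dist = model(pose)   # scalar predicted distance to the grasp level set
        dist.backward()      # gradient flows through the network into the pose
        opt.step()
    return pose.detach()     # quaternion left unnormalized for brevity
```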
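The closest grasp metric mentioned above scores an optimized pose by its distance to the nearest ground-truth grasp. A hedged sketch of one such distance follows; the actual metric and its translation/rotation weighting are defined by the evaluation code, and the names here are hypothetical:

```python
# Hypothetical closest-grasp metric: translation error plus weighted geodesic
# rotation error to the nearest ground-truth grasp. Illustration only; the
# repo's evaluation code defines the actual metric.
import numpy as np

def closest_grasp_distance(pred_pose, gt_poses, rot_weight=0.1):
    """pred_pose: (4, 4) homogeneous transform; gt_poses: (N, 4, 4)."""
    t_err = np.linalg.norm(gt_poses[:, :3, 3] - pred_pose[:3, 3], axis=1)
    rel = gt_poses[:, :3, :3] @ pred_pose[:3, :3].T           # relative rotations
    cos = np.clip((np.trace(rel, axis1=1, axis2=2) - 1) / 2, -1.0, 1.0)
    r_err = np.arccos(cos)                                    # geodesic angle (rad)
    return float((t_err + rot_weight * r_err).min())
```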
## Reaching and Grasping Evaluation

- Set up dependencies
  - OMG-Planner: follow the instructions in `OMG-Planner/README.md`
  - PyTorch3D:
    ```
    pip install "git+https://github.com/facebookresearch/pytorch3d.git@stable"
    ```
  - differentiable-robot-model:
    ```
    cd differentiable-robot-model
    git remote add parent https://github.com/facebookresearch/differentiable-robot-model.git
    git fetch parent
    python setup.py develop
    ```
  - Contact-GraspNet:
    ```
    cd contact_graspnet
    conda env update -f contact_graspnet_env_tf25.yml
    sh compile_pointnet_tfops.sh
    pip install -e .
    ```
    Download the trained model `scene_test_2048_bs3_hor_sigma_001` from here and copy it into the `checkpoints/` folder.
- Run the evaluation script
  ```
  bash scripts/eval/reach_and_grasp/perobj.sh
  ```
  The results are saved in `data/pybullet_eval`. Get summary results in the jupyter notebook:
  ```
  jupyter notebook --notebook-dir=scripts/eval/reach_and_grasp
  ```
## NGDF Training

- Single-object model training:
  ```
  bash scripts/train/perobj_Bottle.sh
  bash scripts/train/perobj_Bowl.sh
  bash scripts/train/perobj_Mug.sh
  ```
- Multi-object model training:
  ```
  bash scripts/train/multobj_Bottle.sh
  ```
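For orientation, these scripts fit a network that predicts the distance from a sampled query pose to its closest ground-truth grasp. Below is a minimal sketch of one training step under that assumption; the L1 loss and all names are illustrative, and the actual objective lives in the `ngdf` package:

```python
# Hypothetical NGDF training step: regress the distance from a query pose to
# its closest ground-truth grasp. The L1 loss and names are assumptions for
# illustration; see the ngdf package for the real training code.
import torch
import torch.nn.functional as F

def training_step(model, query_poses, gt_distances, optimizer):
    """query_poses: (B, 7) sampled poses; gt_distances: (B,) closest-grasp distances."""
    pred = model(query_poses).squeeze(-1)   # predicted distances
    loss = F.l1_loss(pred, gt_distances)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```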
## Docker instructions

- Build the docker image:
  ```
  cd NGDF
  docker build -t ngdf .
  ```
- Run the container:
  ```
  bash docker_run.sh
  source prepare.sh
  ```
- Run the same training commands in the container under `root:/workspace/NGDF#`.
## Bibtex

```
@article{weng2022ngdf,
  title={Neural Grasp Distance Fields for Robot Manipulation},
  author={Weng, Thomas and Held, David and Meier, Franziska and Mukadam, Mustafa},
  journal={IEEE International Conference on Robotics and Automation (ICRA)},
  year={2023}
}
```
## License

The majority of NGDF is licensed under the MIT license; however, a portion of the project is available under separate license terms: Contact-GraspNet is licensed under a non-commercial NVIDIA license.
## Contributing

We actively welcome your pull requests! Please see CONTRIBUTING.md and CODE_OF_CONDUCT.md for more info.