# NS3D: Neuro-Symbolic Grounding of 3D Objects and Relations

Joy Hsu, Jiayuan Mao, Jiajun Wu

In Conference on Computer Vision and Pattern Recognition (CVPR) 2023
## Dataset

Our dataset download process follows the ReferIt3D benchmark. Specifically, you will need to:

1. Download `sr3d_train.csv` and `sr3d_test.csv` from this link.
2. Download scans from ScanNet and process them according to this link. This should result in a `keep_all_points_with_global_scan_alignment.pkl` file.
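As an optional sanity check (not part of the original pipeline), the sketch below loads the Sr3D CSV and confirms that the ScanNet pickle exists. The paths are placeholders, and it only assumes that the Sr3D files are plain CSVs and that `pandas` is available in your environment.

```python
# Hypothetical sanity check for the downloaded files; paths are placeholders.
import os

import pandas as pd

SR3D_TRAIN = "data/sr3d_train.csv"                                   # adjust to your download location
SCANNET_PKL = "data/keep_all_points_with_global_scan_alignment.pkl"  # produced by the ScanNet processing step

# Sr3D is a plain CSV of referential utterances; inspect its size and columns.
sr3d = pd.read_csv(SR3D_TRAIN)
print(f"sr3d_train: {len(sr3d)} rows, columns: {list(sr3d.columns)}")

# The ScanNet pickle is large and is read by trainval.py itself,
# so here we only verify that it exists and report its size.
size_gb = os.path.getsize(SCANNET_PKL) / 1e9
print(f"keep_all_points_with_global_scan_alignment.pkl: {size_gb:.1f} GB")
```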
## Setup

Run the following commands to install necessary dependencies.

```bash
conda create -n ns3d python=3.7.11
conda activate ns3d
pip install -r requirements.txt
```
Install Jacinle.

```bash
git clone https://github.com/vacancy/Jacinle --recursive
export PATH=<path_to_jacinle>/bin:$PATH
```
Install the referit3d python package from ReferIt3D.

```bash
git clone https://github.com/referit3d/referit3d
cd referit3d
pip install -e .
```
Compile CUDA layers for PointNet++.

```bash
cd models/scene_graph/point_net_pp/pointnet2
python setup.py install
```
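Compiling these layers requires a CUDA-enabled PyTorch build and a matching local CUDA toolkit. A minimal, optional pre-flight check (not part of the original instructions) before running `setup.py`:

```python
# Optional pre-flight check before compiling the PointNet++ CUDA layers.
import torch

print("torch version:       ", torch.__version__)
print("CUDA available:      ", torch.cuda.is_available())
print("torch built for CUDA:", torch.version.cuda)  # None on CPU-only builds
```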
## Evaluation

To evaluate NS3D:

```bash
scannet=<path_to/keep_all_points_with_global_scan_alignment.pkl>
referit=<path_to/sr3d_train.csv>
load_path=<path_to/model_to_evaluate.pth>

jac-run ns3d/trainval.py --desc ns3d/desc_ns3d.py --scannet-file $scannet --referit3D-file $referit --load $load_path --evaluate
```

Weights for our trained NS3D model can be found at `trained_ns3d.pth` and loaded via `load_path`.
## Training

To train NS3D:

```bash
scannet=<path_to/keep_all_points_with_global_scan_alignment.pkl>
referit=<path_to/sr3d_train.csv>
load_path=<path_to/pretrained_classification_model.pth>

jac-run ns3d/trainval.py --desc ns3d/desc_ns3d.py --scannet-file $scannet --referit3D-file $referit --load $load_path --lr 0.0001 --epochs 5000 --save-interval 1 --validation-interval 1
```

Weights for our pretrained classification model can be found at `pretrained_cls.pth` and loaded via `load_path`.
## Acknowledgements

Our codebase is built on top of NSCL and ReferIt3D. Please feel free to email me at joycj@stanford.edu if any problems arise.