SAPIEN: A SimulAted Part-based Interactive ENvironment

SAPIEN is a realistic and physics-rich simulated environment that hosts a large-scale set of articulated objects. It enables various robotic vision and interaction tasks that require detailed part-level understanding. SAPIEN is a collaborative effort between researchers at UCSD, Stanford, and SFU. The dataset is a continuation of ShapeNet and PartNet.

Change Log

2.2
2.1
pre-2.0
1-to-2 migration
1.1
1.0

SAPIEN Engine

SAPIEN Engine provides physics simulation for articulated objects. It powers reinforcement learning and robotics research through a pure Python interface.
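
As a rough illustration, the sketch below sets up a scene and steps a simple dynamic actor through the Python interface. It assumes the SAPIEN 2.x API (sapien.core); exact class and method names may differ between releases.

```python
# A minimal sketch of driving a physics simulation through the Python API
# (assumes SAPIEN 2.x; names follow the sapien.core module).
import sapien.core as sapien

engine = sapien.Engine()              # physics engine (PhysX backend)
renderer = sapien.SapienRenderer()    # optional: attach a renderer
engine.set_renderer(renderer)

scene = engine.create_scene()
scene.set_timestep(1 / 240.0)         # simulation step size in seconds
scene.add_ground(altitude=0)          # static ground plane at z = 0

# Build a simple dynamic box actor to simulate.
builder = scene.create_actor_builder()
builder.add_box_collision(half_size=[0.05, 0.05, 0.05])
builder.add_box_visual(half_size=[0.05, 0.05, 0.05])
box = builder.build(name="box")
box.set_pose(sapien.Pose(p=[0, 0, 0.5]))

for _ in range(240):                  # one simulated second
    scene.step()                      # advance physics
    scene.update_render()             # sync state to the renderer
```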

SAPIEN Renderer

SAPIEN provides rasterized and ray-traced rendering with Vulkan.
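
For example, ray tracing can be opted into through the render configuration before the renderer and cameras are created; rasterization remains the default. The sketch below assumes the SAPIEN 2.x API, and the shader directory name ("rt") and sample counts are typical values rather than prescriptions.

```python
# A minimal sketch of enabling the ray tracer and capturing one image
# (assumes SAPIEN 2.x; shader names and defaults may differ by version).
import numpy as np
import sapien.core as sapien

# Ray-tracing settings must be set before the renderer/cameras are created.
sapien.render_config.camera_shader_dir = "rt"   # ray-tracing camera shader
sapien.render_config.rt_samples_per_pixel = 32
sapien.render_config.rt_use_denoiser = True

engine = sapien.Engine()
engine.set_renderer(sapien.SapienRenderer())
scene = engine.create_scene()
scene.add_ground(altitude=0)
scene.set_ambient_light([0.5, 0.5, 0.5])
scene.add_directional_light([0, 1, -1], [1, 1, 1])

camera = scene.add_camera(
    name="camera", width=640, height=480, fovy=np.deg2rad(35), near=0.1, far=100
)
camera.set_pose(sapien.Pose(p=[-1, 0, 0.5]))

scene.step()
scene.update_render()
camera.take_picture()
rgba = camera.get_picture("Color")    # H x W x 4 float image in [0, 1]
```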

PartNet-Mobility

SAPIEN releases the PartNet-Mobility dataset, a collection of 2K articulated objects with motion annotations and rendering materials. The dataset powers research on generalizable computer vision and manipulation.
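
A PartNet-Mobility model ships as a URDF plus meshes and can be loaded with the scene's URDF loader. The sketch below assumes the SAPIEN 2.x API; the asset path is a placeholder for a model downloaded from the SAPIEN website.

```python
# A minimal sketch of loading a PartNet-Mobility object and driving its joints
# (assumes SAPIEN 2.x; the asset path below is a placeholder).
import sapien.core as sapien

engine = sapien.Engine()
engine.set_renderer(sapien.SapienRenderer())
scene = engine.create_scene()
scene.set_timestep(1 / 240.0)

loader = scene.create_urdf_loader()
loader.fix_root_link = True                          # keep the object base static
asset = loader.load("partnet_mobility/179/mobility.urdf")  # placeholder path

# Inspect the articulation's joints and their motion limits.
for joint in asset.get_active_joints():
    print(joint.get_name(), joint.get_limits())

# Drive the joints toward a target configuration and simulate.
for joint in asset.get_active_joints():
    joint.set_drive_property(stiffness=20.0, damping=5.0)
    joint.set_drive_target(0.0)

for _ in range(240):
    scene.step()
```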

Website and Documentation

SAPIEN Website: https://sapien.ucsd.edu/. SAPIEN Documentation: https://sapien.ucsd.edu/docs/latest/index.html.

Build from source

Before build

Make sure all submodules are initialized: git submodule update --init --recursive.

Build with Docker

To build SAPIEN, simply run ./docker_build_wheels.sh. Building outside of the provided Docker image is not recommended.

For reference, the Dockerfile is provided here. Note that PhysX needs to be compiled with clang-9 into static libraries before building the Docker image.

Build without Docker

It can be tricky to set up all dependencies outside of a Docker environment. You will need to install the same dependencies used in the Docker environment. Once all dependencies are set up correctly, run python setup.py bdist_wheel to build the wheel.

Cite SAPIEN

If you use SAPIEN and its assets, please cite the following works:

@InProceedings{Xiang_2020_SAPIEN,
  author = {Xiang, Fanbo and Qin, Yuzhe and Mo, Kaichun and Xia, Yikuan and Zhu, Hao and Liu, Fangchen and Liu, Minghua and Jiang, Hanxiao and Yuan, Yifu and Wang, He and Yi, Li and Chang, Angel X. and Guibas, Leonidas J. and Su, Hao},
  title = {{SAPIEN}: A SimulAted Part-based Interactive ENvironment},
  booktitle = {The IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
  month = {June},
  year = {2020}
}
@InProceedings{Mo_2019_CVPR,
  author = {Mo, Kaichun and Zhu, Shilin and Chang, Angel X. and Yi, Li and Tripathi, Subarna and Guibas, Leonidas J. and Su, Hao},
  title = {{PartNet}: A Large-Scale Benchmark for Fine-Grained and Hierarchical Part-Level {3D} Object Understanding},
  booktitle = {The IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
  month = {June},
  year = {2019}
}
@article{chang2015shapenet,
  title = {{ShapeNet}: An Information-Rich {3D} Model Repository},
  author = {Chang, Angel X. and Funkhouser, Thomas and Guibas, Leonidas and Hanrahan, Pat and Huang, Qixing and Li, Zimo and Savarese, Silvio and Savva, Manolis and Song, Shuran and Su, Hao and others},
  journal = {arXiv preprint arXiv:1512.03012},
  year = {2015}
}

If you use SAPIEN Realistic Depth generated by SAPIEN's simulated depth sensor, please cite the following work:

@ARTICLE{10027470,
  author={Zhang, Xiaoshuai and Chen, Rui and Li, Ang and Xiang, Fanbo and Qin, Yuzhe and Gu, Jiayuan and Ling, Zhan and Liu, Minghua and Zeng, Peiyu and Han, Songfang and Huang, Zhiao and Mu, Tongzhou and Xu, Jing and Su, Hao},
  journal={IEEE Transactions on Robotics}, 
  title={Close the Optical Sensing Domain Gap by Physics-Grounded Active Stereo Sensor Simulation}, 
  year={2023},
  volume={},
  number={},
  pages={1-19},
  doi={10.1109/TRO.2023.3235591}}