<br> <p align="center"> <h1 align="center"><strong>BestMan: A Modular Mobile Manipulator Platform for Embodied AI with Unified Simulation-Hardware APIs</strong></h1> <p align="center"> Chongqing University&emsp;&emsp;&emsp;&emsp;Shanghai AI Laboratory&emsp;&emsp;&emsp;&emsp;Xi'an Jiaotong-Liverpool University </p> </p> <div id="top" align="center">

<!-- # BestMan - A Pybullet-based Mobile Manipulator Simulator -->


Welcome to the official repository of BestMan!

A mobile manipulator platform (a wheeled base plus an arm) built on PyBullet simulation with unified hardware APIs.

</div>

📋 Contents

💻 Installation

```bash
git clone https://github.com/AutonoBot-Lab/BestMan_Pybullet.git
cd BestMan_Pybullet
git submodule update --init
```
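
To confirm that the submodules were fetched correctly, an optional quick check:

```bash
# Optional: each submodule should show a commit hash; a leading '-' means it was not initialized
git submodule status
```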

:shamrock: Conda

First, install Anaconda or Miniconda on your Linux system, then perform the following steps:

```bash
cd Install
chmod 777 pythonpath.sh
bash pythonpath.sh
source ~/.bashrc

sudo apt update && sudo apt install ffmpeg
sudo apt update && sudo apt install -y libgl1-mesa-glx libglib2.0-0
sudo mkdir /usr/lib/dri
sudo ln -s /lib/x86_64-linux-gnu/dri/swrast_dri.so /usr/lib/dri/swrast_dri.so

sudo apt install -y build-essential gcc-9 g++-9
sudo update-alternatives --install /usr/bin/gcc gcc /usr/bin/gcc-9 9
sudo update-alternatives --install /usr/bin/g++ g++ /usr/bin/g++-9 9
sudo update-alternatives --config gcc  # choose gcc-9
sudo update-alternatives --config g++  # choose g++-9
```

```bash
# Make sure the gcc and g++ versions are consistent (do not install gcc inside the conda
# environment, to prevent problems caused by inconsistent versions)
gcc -v
g++ -v

conda install mamba -n base -c conda-forge
mamba env create -f basic_env.yaml    # or: conda env create -f basic_env.yaml
conda activate BestMan
mamba env update -f cuda116.yaml      # or: conda env update -f cuda116.yaml
pip install -U git+https://github.com/luca-medeiros/lang-segment-anything.git
```
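
A quick way to confirm the environment before continuing (an optional sketch, assuming cuda116.yaml installs a CUDA-enabled PyTorch build):

```bash
# Optional sanity check: PyTorch should import and report CUDA availability
python -c "import torch; print(torch.__version__, torch.cuda.is_available())"
```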

Note:

- You need to obtain an AnyGrasp license and checkpoint to use it.

- You need to run `export MAX_JOBS=2` in the terminal before `pip install` if you are running on a laptop, due to this issue; see the sketch below.
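
A minimal sketch of the laptop case (the MinkowskiEngine build below is the step this affects; `MAX_JOBS` caps the number of parallel compilation jobs):

```bash
# On a laptop / low-memory machine, cap parallel build jobs before compiling MinkowskiEngine
export MAX_JOBS=2
echo "MAX_JOBS is set to $MAX_JOBS"
```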

```bash
# Install MinkowskiEngine
conda install pytorch=1.13.1 -c pytorch --force-reinstall
pip install -U git+https://github.com/NVIDIA/MinkowskiEngine -v --no-deps --global-option="--blas_include_dirs=${CONDA_PREFIX}/include" --global-option="--blas=openblas"

# Install graspnetAPI
pip install graspnetAPI

# Install pointnet2
cd third_party/pointnet2
python setup.py install

# Force reinstall to ensure the required versions
pip install --force-reinstall opencv-python==4.1.2.30 numpy==1.23.5
```
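
As an optional sanity check that the pinned versions took effect (a minimal sketch, using only the packages installed above):

```bash
# Should print approximately: 4.1.2 1.23.5
python -c "import cv2, numpy; print(cv2.__version__, numpy.__version__)"
```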

:shamrock: Docker

Windows

```bash
docker pull ccr.ccs.tencentyun.com/4090/bestman:v1
docker run -it --gpus all --name BestMan ccr.ccs.tencentyun.com/4090/bestman:v1
export DISPLAY=host.docker.internal:0
```

Linux
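
The Linux commands are not listed here; a minimal sketch, assuming the same image and an X11 session on the host (not an official recipe), would be:

```bash
# Allow the container to talk to the host X server, then run the same image with GPU access
xhost +local:docker
docker pull ccr.ccs.tencentyun.com/4090/bestman:v1
docker run -it --gpus all --name BestMan \
    -e DISPLAY=$DISPLAY \
    -v /tmp/.X11-unix:/tmp/.X11-unix \
    ccr.ccs.tencentyun.com/4090/bestman:v1
```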

👨‍💻 Basic Demos

First, enter the Examples directory:

```bash
cd Examples
```

Below are some examples and their renderings in Blender.

:shamrock: Navigation

```bash
python navigation_basic.py
```

<video src="https://github.com/user-attachments/assets/63fe074e-ba27-4de8-8095-99289552b17a"></video>

<br/>

:shamrock: Manipulation

```bash
python open_fridge.py
```

<video src="https://github.com/user-attachments/assets/9aab5ca2-fb09-4b9f-a989-54ef5c1d2884"></video>

<br/>
```bash
python open_microwave.py
```

<video src="https://github.com/user-attachments/assets/d0d97b08-423b-4af5-a418-f36872541f99"></video>

<br/>
```bash
python grasp_bowl_on_table_vacuum_gripper.py
```

<video src="https://github.com/user-attachments/assets/0eb05120-8016-425b-a46a-b711e5290691"></video>

<br/>
```bash
python grasp_lego_on_table_gripper.py
```

<video src="https://github.com/user-attachments/assets/3bf15b13-3113-4a72-950c-7e9c1367ed9e"></video>

<br/>
```bash
python move_bowl_from_drawer_to_table.py
```

<video src="https://github.com/user-attachments/assets/db4c7ec3-c136-4c6a-8323-2bef6bc09c84"></video>

<br/>

Blender render

The open-microwave demo rendered with Blender:

<video src="https://github.com/user-attachments/assets/fb8ef3ea-d045-4bbf-a28f-0bec56930aae"></video>

<br/>

We have improved pybullet-blender-recorder to import PyBullet scenes into Blender for better rendering.

If you want to enable pybullet-blender-recorder:

  1. Set `blender: True` in `Config/default.yaml` (see the sketch after this list).

  2. After running a demo, a pkl file will be generated and saved in the `Examples/record` directory.

  3. Install the `pyBulletSimImporter.py` plugin (under the `Visualization/blender-render` directory) in Blender (tested on Blender 3.6.5), and enable the plugin.

<img width="1040" alt="image" src="https://github.com/user-attachments/assets/ab9e99c7-64c8-40fe-bbfe-edc0c786b812">
  4. Import the pkl files into Blender.
<img width="1040" alt="image" src="https://github.com/user-attachments/assets/c0fe66e8-347e-4ecc-b367-8b0c3592d329"> <br/>
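
For step 1, a minimal sketch of the edit, assuming `blender` is a top-level boolean key in `Config/default.yaml` (adjust if the key is nested or named differently):

```bash
# Hypothetical one-liner: flip the recorder flag, then verify it
sed -i 's/^blender: .*/blender: True/' Config/default.yaml
grep '^blender:' Config/default.yaml   # should print: blender: True
```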

📝 TODO List

🤝 Reference

If you find this work useful, please consider citing:

```bibtex
@inproceedings{ding2023task,
  title={Task and motion planning with large language models for object rearrangement},
  author={Ding, Yan and Zhang, Xiaohan and Paxton, Chris and Zhang, Shiqi},
  booktitle={2023 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)},
  pages={2086--2092},
  year={2023},
  organization={IEEE}
}

@article{ding2023integrating,
  title={Integrating action knowledge and LLMs for task planning and situation handling in open worlds},
  author={Ding, Yan and Zhang, Xiaohan and Amiri, Saeid and Cao, Nieqing and Yang, Hao and Kaminski, Andy and Esselink, Chad and Zhang, Shiqi},
  journal={Autonomous Robots},
  volume={47},
  number={8},
  pages={981--997},
  year={2023},
  publisher={Springer}
}
```