<div align="center">
  <h1>Patchwork</h1>
  <a href="https://github.com/LimHyungTae/patchwork"><img src="https://img.shields.io/badge/-C++-blue?logo=cplusplus" /></a>
  <a href="https://github.com/LimHyungTae/patchwork"><img src="https://img.shields.io/badge/ROS-Noetic-blue" /></a>
  <a href="https://github.com/LimHyungTae/patchwork"><img src="https://img.shields.io/badge/Linux-FCC624?logo=linux&logoColor=black" /></a>
  <a href="https://ieeexplore.ieee.org/document/9466396"><img src="https://img.shields.io/badge/DOI-10.1109/LRA.2021.3093009-004088.svg" /></a>
  <br />
  <br />
  <a href="https://youtu.be/rclqeDi4gow">Video</a>
  <span> • </span>
  <a href="https://github.com/LimHyungTae/patchwork?tab=readme-ov-file#requirements">Install by ROS</a>
  <span> • </span>
  <a href="https://arxiv.org/abs/2108.05560">Paper</a>
  <span> • </span>
  <a href="https://github.com/LimHyungTae/patchwork/wiki">Project Wiki (for beginners)</a>
  <br />
  <br />
  <div style="display: flex; justify-content: space-between; width: 100%;">
    <img src="img/demo_kitti00_v2.gif" alt="animated" style="width: 90%;" />
    <img src="img/demo_terrain_v3.gif" alt="animated" style="width: 90%;" />
  </div>
  <br />
  Official page of "Patchwork: Concentric Zone-based Region-wise Ground Segmentation <br />
  with Ground Likelihood Estimation Using a 3D LiDAR Sensor", <br />
  which is accepted by RA-L with IROS'21 option.
</div>

**IMPORTANT (Aug. 18th, 2024)**: We now employ TBB, so the FPS has increased from 50 Hz to 100 Hz! If you want to use the paper version of Patchwork for SOTA comparison purposes, please use this ground seg. benchmark code.
Patchwork | Concept of our method (CZM & GLE)
---|---
It's an overall updated version of R-GPF of ERASOR [Code] [Paper].
## Characteristics

As shown in the demo videos, our method shows robust performance compared with other state-of-the-art methods; in particular, it exhibits little perturbation of precision/recall, as shown in this figure.

Please note that the concepts of *traversable area* and *ground* are quite different! Please refer to our paper.
## :open_file_folder: Contents

### Test Env.

The code was tested successfully on

- Ubuntu 20.04 LTS
- ROS Noetic
## :package: Prerequisite Installation

### ROS Setting

- Install ROS on your machine.
- Thereafter, jsk-visualization is required to visualize the Ground Likelihood Estimation status.
(if you use Ubuntu 20.04)

```bash
sudo apt-get install ros-noetic-jsk-recognition
sudo apt-get install ros-noetic-jsk-common-msgs
sudo apt-get install ros-noetic-jsk-rviz-plugins
```

(if you use Ubuntu 18.04)

```bash
sudo apt-get install ros-melodic-jsk-recognition
sudo apt-get install ros-melodic-jsk-common-msgs
sudo apt-get install ros-melodic-jsk-rviz-plugins
```
- Compile this package. We use catkin tools:

```bash
mkdir -p ~/catkin_ws/src
cd ~/catkin_ws/src
git clone https://github.com/LimHyungTae/patchwork.git
cd .. && catkin build patchwork
```
## :gear: How to Run Patchwork

We provide four examples:

- How to run Patchwork on the SemanticKITTI dataset
  - Offline KITTI dataset
  - Online (ROS Callback) KITTI dataset
- How to run Patchwork on your own dataset
  - Offline by loading pcd files
  - Online (ROS Callback) using your ROS bag file
## :chart_with_upwards_trend: Offline KITTI dataset

- Download the SemanticKITTI Odometry dataset (we also need the labels, since we have also opened the evaluation code! :)
- Set the `data_path` in `launch/offline_kitti.launch` for your machine. The `data_path` consists of a `velodyne` folder and a `labels` folder, as follows:
```
data_path (e.g. 00, 01, ..., or 10)
_____velodyne
     |___000000.bin
     |___000001.bin
     |___000002.bin
     |...
_____labels
     |___000000.label
     |___000001.label
     |___000002.label
     |...
_____...
```
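For reference, each KITTI `velodyne` `.bin` file is simply a flat binary array of float32 quadruplets (x, y, z, intensity). A minimal standalone reader, independent of Patchwork's actual loader, could look like this:

```cpp
#include <cstdio>
#include <vector>

// Read a KITTI velodyne .bin file: a flat sequence of float32 quadruplets
// (x, y, z, intensity). Returns all values in one flat vector, i.e.
// 4 floats per point. This is a sketch, not the Patchwork loader itself.
std::vector<float> read_kitti_bin(const char* path) {
  std::vector<float> data;
  FILE* fp = std::fopen(path, "rb");
  if (!fp) return data;  // empty on failure
  float buf[4];
  while (std::fread(buf, sizeof(float), 4, fp) == 4) {
    data.insert(data.end(), buf, buf + 4);
  }
  std::fclose(fp);
  return data;
}
```

`data.size() / 4` then gives the number of points in the scan.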
- Run the launch file:

```bash
roslaunch patchwork offline_kitti.launch
```

You can directly feel the speed of Patchwork! :wink:
## :runner: Online (via your ROS bag file)

It is easy: just re-use `run_patchwork.launch`.
- First, set your parameter file in the `/config` folder and set `sensor_height` and `sensor_model` appropriately. Other important parameters are:
  - If you feel that too many lower parts of non-ground objects are considered ground, lower `th_seeds` and `th_dist`.
  - If your environment is mostly flat, turn on `using_global_elevation` as `true`.
  - Change `min_r`, `max_r`, and `elevation_thresholds`.
- If you are unsure about the `sensor_height`, simply launch Patchwork and check the estimated `sensor_height` value printed in the terminal. (For this reason, here we set the `sensor_height` as 0.6; a centimeter-level error is totally fine.)
- Remap the topic of the subscriber, e.g., modify the remap line as follows:

```xml
<remap from="/patchwork/cloud" to="$YOUR_LIDAR_TOPIC_NAME$"/>
```

Note that the type of the subscribed data is `sensor_msgs::PointCloud2`.
- Next, run the launch file as follows:

```bash
roslaunch patchwork run_patchwork.launch is_kitti:=false
```

Note that `is_kitti:=false` is important! It decides which rviz configuration is opened; the rviz shows only the estimated ground and non-ground, because your own dataset may have no point-wise labels.
- Then, play your bag file!

```bash
rosbag play $YOUR_BAG_FILE_NAME$.bag
```
<details>
<summary><strong>Exercise with the <a href="https://github.com/MIT-SPARK/Kimera-Multi-Data">Kimera-Multi dataset</a></strong></summary>
For the Kimera-Multi dataset, you can use the following command:

```bash
roslaunch patchwork run_patchwork_kimera_multi.launch
```

Then, play the bag file as follows:

```bash
rosbag play 10_14_acl_jackal2.bag
```
<div style="display: flex; justify-content: space-between; width: 100%;">
<img src="img/kimera-multi.gif" alt="animated" style="width: 90%;" />
</div>
Even though points are very sparse, Patchwork just works well!
</details>

## :mag: Own dataset using pcd files
Please refer to `/nodes/offilne_own_data.cpp`.

(Note that in your own data format, there may be no ground-truth labels!)

Be sure to set the right parameters. Otherwise, your results may be wrong, as follows:
W/ wrong params | After setting right params
---|---
For a better understanding of the parameters of Patchwork, please read our wiki, 4. IMPORTANT: Setting Parameters of Patchwork in Your Own Env..
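As background for parameters such as `min_r` and `max_r`: Patchwork bins points into concentric rings and angular sectors (the Concentric Zone Model). The sketch below is a deliberately simplified uniform partition for intuition only; the actual CZM uses multiple zones with different ring/sector resolutions.

```cpp
#include <algorithm>
#include <cmath>
#include <utility>

// Map a point (x, y) to a (ring, sector) bin index within the radial range
// [min_r, max_r). Points outside the range map to (-1, -1). This uniform
// partition is an illustration; Patchwork's real CZM is zone-dependent.
std::pair<int, int> ring_sector_index(double x, double y,
                                      double min_r, double max_r,
                                      int num_rings, int num_sectors) {
  const double two_pi = 2.0 * std::acos(-1.0);
  const double r = std::sqrt(x * x + y * y);
  if (r < min_r || r >= max_r) return {-1, -1};
  double theta = std::atan2(y, x);   // [-pi, pi]
  if (theta < 0.0) theta += two_pi;  // [0, 2*pi)
  const int ring = std::min(
      static_cast<int>((r - min_r) / (max_r - min_r) * num_rings),
      num_rings - 1);
  const int sector = std::min(
      static_cast<int>(theta / two_pi * num_sectors), num_sectors - 1);
  return {ring, sector};
}
```

Ground fitting then runs per bin, which is why ill-chosen `min_r`/`max_r` (points falling outside every bin) can silently degrade the results.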
### Offline (Using *.pcd or *.bin file)

- Utilize `/nodes/offilne_own_data.cpp`.
- Please check the output with the following command and the corresponding files.
- Set the appropriate absolute file directory, i.e. `file_dir`, in `offline_ouster128.launch`:

```bash
roslaunch patchwork offline_ouster128.launch
```
## Citation

If you use our code or method in your work, please consider citing the following:

```bibtex
@article{lim2021patchwork,
  title={Patchwork: Concentric Zone-based Region-wise Ground Segmentation with Ground Likelihood Estimation Using a 3D LiDAR Sensor},
  author={Lim, Hyungtae and Oh, Minho and Myung, Hyun},
  journal={IEEE Robotics and Automation Letters},
  year={2021}
}
```
## Updates

### NEWS (22.12.24)

- Merry Christmas Eve XD!
- `include/label_generator` is added to make the `.label` file, following the SemanticKITTI format.
- The `.label` files can be directly used in the 3DUIS benchmark.
- Thanks to Lucas Nunes and Xieyuanli Chen for providing code snippets to save a `.label` file.
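For context, a SemanticKITTI `.label` file stores one little-endian `uint32` per point, with the semantic label in the lower 16 bits and the instance id in the upper 16 bits. A standalone sketch of such a writer (the actual code in `include/label_generator` may differ in detail):

```cpp
#include <cstdint>
#include <cstdio>
#include <vector>

// Write a SemanticKITTI-style .label file: one uint32 per point, packing
// (instance_id << 16) | semantic_label. Returns false on size mismatch or
// if the file cannot be opened. Illustrative sketch only.
bool save_labels(const char* path,
                 const std::vector<uint16_t>& semantic,
                 const std::vector<uint16_t>& instance) {
  if (semantic.size() != instance.size()) return false;
  FILE* fp = std::fopen(path, "wb");
  if (!fp) return false;
  for (size_t i = 0; i < semantic.size(); ++i) {
    const uint32_t packed =
        (static_cast<uint32_t>(instance[i]) << 16) | semantic[i];
    std::fwrite(&packed, sizeof(uint32_t), 1, fp);
  }
  std::fclose(fp);
  return true;
}
```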
### NEWS (22.07.25)

- Pybinding + a more advanced version is now available in Patchwork++ as a preprocessing step for deep learning users (i.e., Python users can also use our robust ground segmentation)!
### NEWS (22.07.13)

- To increase convenience of use, the examples and code have been extensively revised, reflecting issue #12.
### NEWS (22.05.22)

- The meaning of `elevation_thresholds` is changed to increase usability. The meaning is explained in the wiki.
- A novel height estimator, called the All-Terrain Automatic heighT estimator (ATAT), is added within the Patchwork code; it auto-calibrates the sensor height using the ground points in the vicinity of the vehicle/mobile robot.
  - Please refer to the function `consensus_set_based_height_estimation()`.
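To convey the consensus idea behind ATAT, here is a toy estimator: among candidate ground z-values, pick the one supported by the largest set of values within a tolerance, and return the mean of that consensus set. This is only an illustration of the principle; the actual `consensus_set_based_height_estimation()` differs.

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Toy consensus-set height estimate: for each candidate z-value, count how
// many values lie within `tol` of it; return the mean of the largest such
// consensus set. Outliers (e.g. points on obstacles) are voted out.
double consensus_height(const std::vector<double>& z, double tol) {
  double best_mean = 0.0;
  std::size_t best_count = 0;
  for (double c : z) {
    double sum = 0.0;
    std::size_t count = 0;
    for (double v : z) {
      if (std::fabs(v - c) <= tol) { sum += v; ++count; }
    }
    if (count > best_count) { best_count = count; best_mean = sum / count; }
  }
  return best_mean;
}
```

With mostly consistent near-ground z-values and a few outliers, the estimate tracks the dominant cluster, which is why a centimeter-level error in the initial `sensor_height` guess is tolerable.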
### NEWS (21.12.27)

- `pub_for_legoloam` node for the point cloud in the KITTI bag file is added.
- `ground_estimate.msg` is added.
- A bug in the `xy2theta` function is fixed.
- How to run:

```bash
roslaunch patchwork pub_for_legoloam.launch
rosbag play {YOUR_FILE_PATH}/KITTI_BAG/kitti_sequence_00.bag --clock /kitti/velo/pointcloud:=/velodyne_points
```

- The README about this LiDAR odometry part is still incomplete. It will be updated soon!