Related Works and Extended Application

SLAM:

  1. ikd-Tree: A state-of-the-art dynamic KD-tree for 3D kNN search.
  2. R2LIVE: A high-precision LiDAR-inertial-vision fusion framework that uses FAST-LIO as its LiDAR-inertial front end.
  3. LI_Init: A robust, real-time LiDAR-IMU extrinsic initialization and synchronization package.
  4. FAST-LIO-LOCALIZATION: The integration of FAST-LIO with a re-localization module.

Control and Plan:

  1. IKFOM: A toolbox for fast, high-precision on-manifold Kalman filtering.
  2. UAV Avoiding Dynamic Obstacles: An application of FAST-LIO to robot planning.
  3. UGV Demo: Model Predictive Control for Trajectory Tracking on Differentiable Manifolds.
  4. Bubble Planner: Planning High-speed Smooth Quadrotor Trajectories using Receding Corridors.
<!-- 10. [**FAST-LIVO**](https://github.com/hku-mars/FAST-LIVO): Fast and Tightly-coupled Sparse-Direct LiDAR-Inertial-Visual Odometry. -->

FAST-LIO

FAST-LIO (Fast LiDAR-Inertial Odometry) is a computationally efficient and robust LiDAR-inertial odometry package. It fuses LiDAR feature points with IMU data using a tightly-coupled iterated extended Kalman filter, which allows robust navigation in fast-motion, noisy, or cluttered environments where degeneration occurs (a schematic form of the iterated update is sketched after the list below). Our package addresses many key issues:

  1. Fast iterated Kalman filter for odometry optimization;
  2. Automatic initialization in most stationary environments;
  3. Parallel KD-tree search to reduce computation.
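
For orientation, a simplified (Euclidean) form of the measurement update performed at each iteration of such a filter is shown below; the exact on-manifold formulation used by FAST-LIO is given in the papers listed under "Related papers". Here $\bar{\mathbf{x}}$ denotes the IMU-propagated prior and $\mathbf{H}_k$ the measurement Jacobian at the current iterate $\hat{\mathbf{x}}_k$:

$$
\begin{aligned}
\mathbf{K}_k &= \mathbf{P}\,\mathbf{H}_k^{\top}\left(\mathbf{H}_k \mathbf{P}\,\mathbf{H}_k^{\top} + \mathbf{R}\right)^{-1} \\
\hat{\mathbf{x}}_{k+1} &= \bar{\mathbf{x}} + \mathbf{K}_k\left(\mathbf{z} - h(\hat{\mathbf{x}}_k) - \mathbf{H}_k\left(\bar{\mathbf{x}} - \hat{\mathbf{x}}_k\right)\right)
\end{aligned}
$$

The iteration repeats until the state increment converges, after which the covariance is updated once.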

FAST-LIO 2.0 (2021-07-05 Update)

<!-- ![image](doc/real_experiment2.gif) --> <!-- [![Watch the video](doc/real_exp_2.png)](https://youtu.be/2OvjGnxszf8) --> <div align="left"> <img src="doc/real_experiment2.gif" width=49.6% /> <img src="doc/ulhkwh_fastlio.gif" width = 49.6% > </div>

Related video: FAST-LIO2, FAST-LIO1

Pipeline:

<div align="center"> <img src="doc/overview_fastlio2.svg" width=99% /> </div>

New Features:

  1. Incremental mapping using ikd-Tree, achieving faster speed and supporting LiDAR rates of over 100 Hz.
  2. Direct odometry (scan-to-map) on raw LiDAR points (feature extraction can be disabled), achieving better accuracy.
  3. Since feature extraction is not required, FAST-LIO2 can support many types of LiDAR, including spinning (Velodyne, Ouster) and solid-state (Livox Avia, Horizon, MID-70) LiDARs, and can easily be extended to support more.
  4. Support for external IMUs.
  5. Support for ARM-based platforms including Khadas VIM3, NVIDIA TX2, and Raspberry Pi 4B (8 GB RAM).

Related papers:

FAST-LIO2: Fast Direct LiDAR-inertial Odometry

FAST-LIO: A Fast, Robust LiDAR-inertial Odometry Package by Tightly-Coupled Iterated Kalman Filter

Contributors

Wei Xu 徐威, Yixi Cai 蔡逸熙, Dongjiao He 贺东娇, Fangcheng Zhu 朱方程, Jiarong Lin 林家荣, Zheng Liu 刘政, Borong Yuan

<!-- <div align="center"> <img src="doc/results/HKU_HW.png" width = 49% > <img src="doc/results/HKU_MB_001.png" width = 49% > </div> -->

1. Prerequisites

1.1 Ubuntu and ROS

Ubuntu >= 16.04

For Ubuntu 18.04 or higher, the default PCL and Eigen are sufficient for FAST-LIO to work normally.

ROS >= Melodic. ROS Installation

1.2. PCL && Eigen

PCL >= 1.8, Follow PCL Installation.

Eigen >= 3.3.4, Follow Eigen Installation.
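
A quick way to check the versions already installed, assuming apt-based installs with the usual Ubuntu package names libpcl-dev and libeigen3-dev:

    # Print the installed PCL and Eigen versions (Debian/Ubuntu packages assumed)
    dpkg -s libpcl-dev | grep Version
    pkg-config --modversion eigen3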

1.3. livox_ros_driver

Follow livox_ros_driver Installation.

Remarks: Since FAST-LIO supports Livox series LiDARs, livox_ros_driver must be installed and sourced before running any FAST-LIO launch file.

2. Build

If you want to use a Docker container to run FAST-LIO2, please install Docker on your machine. Follow Docker Installation.

2.1 Docker Container

Users can create a new script with any name using the following command on Linux:

touch <your_custom_name>.sh

Place the following code inside the <your_custom_name>.sh script.

#!/bin/bash
# Script to run the FAST-LIO2 container with GUI support in Docker

# Create the shared workspace directory on the host (mounted into the container below)
mkdir -p /home/$USER/docker_ws

# Allow X server to be accessed from the local machine
xhost +local:

# Container name
CONTAINER_NAME="fastlio2"

# Run the Docker container
docker run -itd \
  --name=$CONTAINER_NAME \
  --user mars_ugv \
  --network host \
  --ipc=host \
  -v /home/$USER/docker_ws:/home/mars_ugv/docker_ws \
  --privileged \
  --env="QT_X11_NO_MITSHM=1" \
  --volume="/etc/localtime:/etc/localtime:ro" \
  -v /dev/bus/usb:/dev/bus/usb \
  --device=/dev/dri \
  --group-add video \
  -v /tmp/.X11-unix:/tmp/.X11-unix \
  --env="DISPLAY=$DISPLAY" \
  kenny0407/marslab_fastlio2:latest \
  /bin/bash

Execute the following command to grant execute permission to the script, making it runnable:

sudo chmod +x <your_custom_name>.sh

Execute the following command to download the image and create the container:

./<your_custom_name>.sh
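
The script starts the container in detached mode. To open a shell inside the running container afterwards (the container name fastlio2 comes from the CONTAINER_NAME variable in the script; adjust it if you changed the name):

    # Attach an interactive shell to the running container
    docker exec -it fastlio2 /bin/bash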

Script explanation:

  1. docker run -itd creates and starts the container in detached mode from the image kenny0407/marslab_fastlio2:latest, naming it fastlio2.
  2. -v /home/$USER/docker_ws:/home/mars_ugv/docker_ws shares a workspace folder between the host and the container.
  3. The X11-related options (xhost +local:, the /tmp/.X11-unix volume, DISPLAY, QT_X11_NO_MITSHM) allow GUI tools such as RViz running inside the container to display on the host.
  4. --network host, --ipc=host, --privileged, and the device/volume mounts give the container access to the host network, USB devices, and GPU.

2.2 Build from source

Clone the repository and catkin_make:

    cd ~/$A_ROS_DIR$/src
    git clone https://github.com/hku-mars/FAST_LIO.git
    cd FAST_LIO
    git submodule update --init
    cd ../..
    catkin_make
    source devel/setup.bash
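
If livox_ros_driver was built in a separate workspace, it may need to be sourced so that catkin_make and the launch files can find it. A minimal sketch, assuming the driver workspace is at ~/ws_livox (an example path; adjust to your setup):

    # Source the livox_ros_driver workspace before building/running FAST-LIO
    source ~/ws_livox/devel/setup.bash
    cd ~/$A_ROS_DIR$
    catkin_make
    source devel/setup.bash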

3. Directly run

Notes:

A. Please make sure the IMU and LiDAR are synchronized; this is important.

B. The warning message "Failed to find match for field 'time'." means the per-point timestamps are missing from the rosbag file. These timestamps are important for the forward and backward propagation.

C. We recommend setting extrinsic_est_en to false if the extrinsic is given. For extrinsic initialization, please refer to our recent work: Robust Real-time LiDAR-inertial Initialization.
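
If you hit the warning from note B when playing a bag, a quick way to check whether the point cloud messages actually carry per-point timestamps is to inspect the PointCloud2 fields during playback. The topic name /velodyne_points below is only an example; replace it with your LiDAR topic:

    # Terminal 1: replay the data
    rosbag play your_data.bag
    # Terminal 2: print the field list once and look for a per-point 'time' (or 't') field
    rostopic echo -n 1 /velodyne_points/fields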

3.1 For Avia

Connect your PC to the Livox Avia LiDAR by following the Livox-ros-driver installation, then

    cd ~/$FAST_LIO_ROS_DIR$
    source devel/setup.bash
    roslaunch fast_lio mapping_avia.launch
    roslaunch livox_ros_driver livox_lidar_msg.launch

3.2 For Livox series with external IMU

mapping_avia.launch theoretically supports the Mid-70, Mid-40, and other Livox series LiDARs, but some parameters need to be set before running:

Edit config/avia.yaml to set the parameters below (a quick way to check the actual topic names is sketched after this list):

  1. LiDAR point cloud topic name: lid_topic
  2. IMU topic name: imu_topic
  3. Translational extrinsic: extrinsic_T
  4. Rotational extrinsic: extrinsic_R (only rotation matrices are supported)
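
If you are unsure of the exact topic names, rostopic can help. The names /livox/lidar and /livox/imu below are the usual livox_ros_driver defaults and only an assumption about your setup:

    # List active topics, then check the message types of the candidates
    rostopic list
    rostopic info /livox/lidar   # expect livox_ros_driver/CustomMsg when using livox_lidar_msg.launch
    rostopic info /livox/imu     # expect sensor_msgs/Imu (or use your external IMU topic instead)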

3.3 For Velodyne or Ouster (Velodyne as an example)

Step A: Setup before run

Edit config/velodyne.yaml to set the parameters below (a few quick checks are sketched after this list):

  1. LiDAR point cloud topic name: lid_topic
  2. IMU topic name: imu_topic (both internal and external, 6-axis or 9-axis IMUs are fine)
  3. Set the parameter timestamp_unit based on the unit of the time (Velodyne) or t (Ouster) field in the PointCloud2 rostopic
  4. Line number (we tested 16, 32 and 64 lines, but not 128 or above): scan_line
  5. Translational extrinsic: extrinsic_T
  6. Rotational extrinsic: extrinsic_R (only rotation matrices are supported)
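
A few quick sanity checks before launching; the topic names /velodyne_points and /imu/data are only examples, substitute your own:

    # Confirm the message types and that the IMU runs much faster than the LiDAR (e.g. 100 Hz or more)
    rostopic info /velodyne_points   # should be sensor_msgs/PointCloud2  -> lid_topic
    rostopic info /imu/data          # should be sensor_msgs/Imu          -> imu_topic
    rostopic hz /imu/data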

Step B: Run the following

    cd ~/$FAST_LIO_ROS_DIR$
    source devel/setup.bash
    roslaunch fast_lio mapping_velodyne.launch

Step C: Run LiDAR's ros driver or play rosbag.
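
For example, for a VLP-16 using the standard ROS driver (the launch file below comes from the velodyne_pointcloud package and is an assumption about your hardware), or simply replay recorded data:

    # Option 1: start the LiDAR driver (example: VLP-16)
    roslaunch velodyne_pointcloud VLP16_points.launch
    # Option 2: replay a recorded dataset
    rosbag play your_data.bag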

3.4 PCD file save

Set pcd_save_enable in the launch file to 1. All scans (in the global frame) will be accumulated and saved to the file FAST_LIO/PCD/scans.pcd after FAST-LIO is terminated. pcl_viewer scans.pcd can visualize the point clouds (see the example after the tips below).

Tips for pcl_viewer:

    1 is all random
    2 is X values
    3 is Y values
    4 is Z values
    5 is intensity
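
For example, after FAST-LIO has been terminated (the workspace path below is an example; adjust it to wherever you cloned FAST_LIO):

    # Open the accumulated map; press '5' inside the viewer to color the points by intensity
    pcl_viewer ~/catkin_ws/src/FAST_LIO/PCD/scans.pcd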

4. Rosbag Example

4.1 Livox Avia Rosbag

<div align="left"> <img src="doc/results/HKU_LG_Indoor.png" width=47% /> <img src="doc/results/HKU_MB_002.png" width = 51% >

Files: can be downloaded from Google Drive

Run:

roslaunch fast_lio mapping_avia.launch
rosbag play YOUR_DOWNLOADED.bag

4.2 Velodyne HDL-32E Rosbag

NCLT Dataset: The original bin files can be found here.

We provide rosbag files and a Python script to generate rosbag files from them: python3 sensordata_to_rosbag_fastlio.py bin_file_dir bag_name.bag

Run:

roslaunch fast_lio mapping_velodyne.launch
rosbag play YOUR_DOWNLOADED.bag

5. Implementation on UAV

To validate the robustness and computational efficiency of FAST-LIO on actual mobile robots, we built a small-scale quadrotor that carries a Livox Avia LiDAR with a 70-degree FoV and a DJI Manifold 2-C onboard computer with a 1.8 GHz Intel i7-8550U CPU and 8 GB RAM, as shown below.

The main structure of this UAV is 3D printed (aluminum or PLA); the .stl file will be open-sourced in the future.

<div align="center"> <img src="doc/uav01.jpg" width=40.5% > <img src="doc/uav_system.png" width=57% > </div>

6. Acknowledgments

Thanks to LOAM (J. Zhang and S. Singh, LOAM: Lidar Odometry and Mapping in Real-time), Livox_Mapping, LINS, and Loam_Livox.