ESVO Extension

1. Overview

We extend the ESVO framework with three additional modules:

The usage of these modules is described below:

1.1 esvo_MVSMono.cpp

We adapt EMVS-style monocular multi-view mapping (given the ground-truth poses) to the ESVO framework. To launch the mapper, run

$ roslaunch esvo_core mvsmono_xxx.launch

This will launch one esvo_time_surface node (for the left event camera) and the mapping node simultaneously. The time surface is not used by the mapper itself, but is kept for API compatibility with the original ESVO.

Video: from top to bottom: raw image, confidence map, mask, depth map

<a target="_blank"><img src="./pict/MVSMONO_simu_office_planar.gif" alt="" width="420" height="300" /></a>

1.2 esvo_MonoMapping.cpp and esvo_MonoTracking.cpp

We follow EVO in integrating the monocular mapper with the time-surface-based tracker to form a monocular event-based visual odometry system. Preliminary results are provided. To launch the system, run

$ roslaunch esvo_core monosystem_xxx.launch

1.3 Tracking.cpp

ESVO tackles the tracking problem using the time surface (TS), while EVO does it on the binary event map (EM). This module additionally implements the EM-based tracker for a comparative evaluation. Complete experimental results will be published.
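The difference between the two representations is easy to illustrate. The sketch below (plain Python on `(x, y, t)` tuples, not the actual tracker code) builds a binary event map: every pixel that received an event inside a recent time window is set to 1, with no notion of recency beyond the window.

```python
# Illustrative sketch (not the ESVO/EVO implementation): a binary event
# map marks pixels that fired within a recent time window.
def binary_event_map(events, t_now, window, width, height):
    """events: list of (x, y, t); returns a height x width 0/1 grid."""
    em = [[0] * width for _ in range(height)]
    for x, y, t in events:
        if t_now - window <= t <= t_now:
            em[y][x] = 1
    return em

events = [(0, 0, 0.10), (1, 1, 0.95), (2, 0, 0.99)]
em = binary_event_map(events, t_now=1.0, window=0.1, width=3, height=2)
# Only the two most recent events fall inside the 0.1 s window.
```

A time surface, by contrast, retains per-pixel timestamps and decays them continuously, which is what gives the TS-based tracker its smooth photometric-style residuals.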

<!-- Table: mean ATE [*cm*] of all tracker variations under 10 SLAM trials <a target="_blank"><img src="./pict/tracker_comparison.png" alt="" width="630" height="277.374" /></a> -->

Video: from left to right: TS-based, EM-based, TSEM-based tracker.

<a target="_blank"><img src="./pict/Tracking_upenn_flying3.gif" alt="" width="630" height="314.48" /></a>

2. Simulated Datasets

We use the event-camera simulator ESIM to collect several simulated stereo event-camera sequences. The stereo rig performs planar or 6-DoF motion in front of a wall with different backgrounds: simple shapes, a checkerboard, and an office scene. These sequences can be used for algorithm verification. They can be downloaded here.

We provide a script to perform batch tests and evaluation. To run ESVO with trackers on different event representations:

$ python run_esvo.py -dataset=rpg_stereo -sequence=rpg_bin -representation=TS,EM,TSEM -eventnum=2000,3000,4000 -trials=1 -program=run,eval,load_result

We use the rpg_trajectory_evaluation package to compute the RMSE and the relative pose error (RPE).
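As a reference for what that evaluation computes, the RMSE of the absolute trajectory error over time-associated position estimates can be sketched in a few lines (plain Python; rpg_trajectory_evaluation additionally performs trajectory alignment, which is omitted here):

```python
import math

def ate_rmse(est, gt):
    """Root-mean-square absolute trajectory error between two
    already-aligned, time-associated lists of 3D positions."""
    assert len(est) == len(gt)
    sq = [sum((e - g) ** 2 for e, g in zip(p, q)) for p, q in zip(est, gt)]
    return math.sqrt(sum(sq) / len(sq))

est = [(0.0, 0.0, 0.0), (1.1, 0.0, 0.0), (2.0, 0.1, 0.0)]
gt  = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 0.0, 0.0)]
rmse = ate_rmse(est, gt)  # sqrt of the mean squared position error
```

The RPE is computed analogously, but over relative motions between pose pairs rather than absolute positions.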

3. Acknowledgement

Thanks to the authors for their great work on ESVO!




ESVO: Event-based Stereo Visual Odometry

ESVO is a novel pipeline for real-time visual odometry using a stereo event-based camera. Both the proposed mapping and tracking methods leverage a unified event representation (time surfaces); thus, ESVO can be regarded as a "direct", geometric method using raw events as input.
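The idea of a time surface can be sketched in a few lines: each pixel stores the timestamp of its most recent event, exponentially decayed so that fresh activity is bright and stale pixels fade. This is a minimal sketch assuming a single decay constant, not the esvo_time_surface implementation:

```python
import math

def time_surface(events, t_now, decay, width, height):
    """events: list of (x, y, t) sorted by t; returns a grid where each
    pixel holds exp(-(t_now - t_last) / decay) for its latest event."""
    t_last = [[None] * width for _ in range(height)]
    for x, y, t in events:
        t_last[y][x] = t  # later events overwrite earlier ones
    return [[0.0 if t is None else math.exp(-(t_now - t) / decay)
             for t in row] for row in t_last]

ts = time_surface([(0, 0, 0.2), (1, 0, 0.9), (0, 0, 0.5)],
                  t_now=1.0, decay=0.5, width=2, height=1)
# Pixel (0,0) keeps its latest timestamp 0.5; pixel (1,0) is the freshest.
```

Because the resulting map is dense and smoothly varying, standard direct (intensity-based) alignment machinery can be applied to it, which is what "direct, geometric method" refers to above.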

Please refer to the ESVO Project Page for more detailed information and for testing event data.

Related Publications

1. Installation

We have tested ESVO on machines with the following configurations

For Ubuntu 16.04, you may need to upgrade your cmake.

1.1 Driver Installation

To work with event cameras, especially for the Dynamic Vision Sensors (DVS/DAVIS), you need to install some drivers. Please follow the instructions (steps 1-9) at rpg_dvs_ros before moving on to the next step. Note that you need to replace the name of the ROS distribution with the one installed on your computer.

We use catkin tools to build the code. You should have it installed during the driver installation.

1.2 Dependencies Installation

You should have created a catkin workspace in Section 1.1. If not, please go back and create one.

Clone this repository into the src folder of your catkin workspace.

$ cd ~/catkin_ws/src 
$ git clone https://github.com/HKUST-Aerial-Robotics/ESVO.git

Dependencies are specified in the file dependencies.yaml. They can be installed with the following commands from the src folder of your catkin workspace:

$ cd ~/catkin_ws/src
$ sudo apt-get install python3-vcstool
$ vcs-import < ESVO/dependencies.yaml

The previous command should clone the repositories into folders called catkin_simple, glog_catkin, gflags_catkin, minkindr, etc., inside the src folder of your catkin workspace, at the same level as this repository (ESVO).

You may need autoreconf to compile glog_catkin. autoreconf is provided by the autoconf package; to install it, run

$ sudo apt-get install autoconf

yaml-cpp is only used for loading calibration parameters from yaml files:

$ cd ~/catkin_ws/src 
$ git clone https://github.com/jbeder/yaml-cpp.git
# or download from https://github.com/jbeder/yaml-cpp/releases/tag/release-0.5.1
$ cd yaml-cpp
$ mkdir build && cd build && cmake -DYAML_BUILD_SHARED_LIBS=ON ..
$ make -j

Other ROS dependencies should have been installed in Section 1.1. If any happen to be missing, install them accordingly. Besides, you also need OpenCV (3.2 or later) and Eigen 3 installed.

1.3 ESVO Installation

After cloning this repository, as stated above (reminder)

$ cd ~/catkin_ws/src 
$ git clone https://github.com/HKUST-Aerial-Robotics/ESVO.git

run

$ catkin build esvo_time_surface esvo_core
$ source ~/catkin_ws/devel/setup.bash

2. Usage

To run the pipeline, you need to download rosbag files from the ESVO Project Page.

2.1 esvo_time_surface

This package implements a node that constantly updates the stereo time maps (i.e., time surfaces). To launch it independently, open a terminal and run the command:

$ roslaunch esvo_time_surface stereo_time_surface.launch

To play a bag file, go to esvo_time_surface/launch/rosbag_launcher and modify the path in [bag_name].launch according to where your rosbag file is downloaded. Then execute

$ roslaunch esvo_time_surface [bag_name].launch

2.2 esvo_core

This package implements the proposed mapping and tracking methods. The initialization is implemented inside the mapping part. To launch the system, run

$ roslaunch esvo_core system_xxx.launch

This will launch two esvo_time_surface nodes (for left and right event cameras, respectively), the mapping node and the tracking node simultaneously. Then play the input (already downloaded) bag file by running

$ roslaunch esvo_time_surface [bag_name].launch

To save trajectories at any time, go to another terminal and terminate the system by running

$ rosparam set /ESVO_SYSTEM_STATUS "TERMINATE"

You need to set the path in /cfg/tracking_xxx.yaml to the location where the result file will be saved.
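The saved trajectory can then be inspected with a few lines of Python. This sketch assumes a space-separated `timestamp tx ty tz qx qy qz qw` line format (the convention used by rpg_trajectory_evaluation; verify it against your actual result file):

```python
def load_trajectory(path):
    """Parse 'timestamp tx ty tz qx qy qz qw' lines, skipping comments
    and blank lines. Returns a list of stamped pose dicts."""
    poses = []
    with open(path) as f:
        for line in f:
            if line.startswith('#') or not line.strip():
                continue
            vals = [float(v) for v in line.split()]
            poses.append({'t': vals[0],
                          'position': vals[1:4],
                          'orientation': vals[4:8]})  # quaternion x y z w
    return poses
```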

2.3 esvo_core/mvstereo

This module implements the mapper of ESVO and some other event-based mapping methods (e.g., [26], [45]). As a multi-view stereo (MVS) pipeline, it assumes that camera poses are known a priori. To launch the mapper, run

$ roslaunch esvo_core mvstereo_xxx.launch

This will launch two esvo_time_surface nodes (for left and right event cameras, respectively), and the mapping node simultaneously. Then play the input (already downloaded) bag file by running

$ roslaunch esvo_time_surface [bag_name].launch

Note that only the rpg and upenn datasets are applicable for this module, because they come with ground-truth poses.

3. Parameters (Dynamic Reconfigure)

Time Surface

Mapping

Event Matching

Block Matching

Non-linear Optimization parameters

Tracking

4. Notes for Good Results

Real-time performance is witnessed on a Razer Blade 15 laptop (Intel® Core™ i7-8750H CPU @ 2.20GHz × 12).

If your machine is less powerful, you may play the bag file at a lower rate and modify the rate of the external clock (used for synchronizing the stereo time surfaces) accordingly, e.g.

`<node name="global_timer" pkg="rostopic" type="rostopic" args="pub -s -r 70 /sync std_msgs/Time 'now' ">`

In this example, the bag file is played at a factor of 0.7, and thus the synchronization signal is set to 70 Hz accordingly. These modifications must be made such that the time surface is updated (refreshed) at 100 Hz in simulation time. You can check this by running

$ rostopic hz /TS_left
$ rostopic hz /TS_right

They are both supposed to be approximately 100 Hz.

5. Datasets

The event data fed to ESVO needs to be recorded at a remarkably higher streaming rate than the default configuration (30 Hz) of the rpg_dvs_ros driver provides. This is because esvo_time_surface operates at 100 Hz. To refresh the time surfaces with the most current events, a notably higher streaming rate is needed (e.g., 1000 Hz). The streaming rate can either be set in the hardware or modified by rewriting the bag. We provide a naive example in /rosbag_editor to show how.
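The effect of rewriting the bag can be sketched as regrouping the recorded event stream into finer-grained packets. The sketch below works on plain `(x, y, t)` tuples; the real /rosbag_editor example operates on dvs_msgs/EventArray messages through the rosbag API:

```python
def regroup_events(events, packet_duration):
    """Split a time-ordered event stream into packets spanning at most
    `packet_duration` seconds each, raising the effective streaming rate
    (e.g., ~33 ms packets at 30 Hz -> 1 ms packets at ~1000 Hz)."""
    packets, current, t_start = [], [], None
    for ev in events:
        t = ev[2]
        if t_start is None:
            t_start = t
        if t - t_start >= packet_duration:
            packets.append(current)  # close the current packet
            current, t_start = [], t
        current.append(ev)
    if current:
        packets.append(current)
    return packets
```

Each resulting packet would then be republished (or rewritten into the bag) with its own header stamp, which is what raises the message rate seen by esvo_time_surface.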

For convenience, we provide a number of bag files which have been rewritten to meet the above requirement. They can be downloaded from the ESVO Project Page.

6. License

ESVO is licensed under the GNU General Public License Version 3 (GPLv3), see http://www.gnu.org/licenses/gpl.html.

For commercial use, please contact Yi Zhou and Shaojie Shen.

Email addresses are available in the project page.

7. Log