EMSGC: Event-based Motion Segmentation with Spatio-Temporal Graph Cuts

EMSGC provides a solution for identifying independently moving objects acquired with an event-based camera, i.e., for solving the event-based motion segmentation problem. The problem is cast as an energy minimization that involves fitting multiple motion models. EMSGC jointly solves two subproblems, namely event-cluster assignment (labeling) and motion model fitting, in an iterative manner by exploiting the structure of the input event data in the form of a spatio-temporal graph.
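
In generic MRF terms, such a joint labeling-and-fitting objective has the usual data-plus-smoothness form (the notation below is illustrative, not the paper's exact formulation):

```latex
% Generic labeling energy over events e with labels \ell_e and motion models M_j
% (illustrative notation; see the paper for the exact terms)
E(\ell, M) \;=\; \sum_{e} D_e\bigl(\ell_e, M_{\ell_e}\bigr)
\;+\; \lambda \sum_{(e,e') \in \mathcal{N}} \mathbb{1}\bigl[\ell_e \neq \ell_{e'}\bigr]
```

Here the data term measures how well event e fits the motion model of its label (e.g., via contrast after motion-compensated warping), and the Potts-style smoothness term penalizes label disagreement between events that are neighbors in the spatio-temporal graph. The iteration alternates between fixing the models and solving the labeling by graph cuts, and fixing the labels and refitting each motion model.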

Please refer to the EMSGC Project Page for more detailed information and for testing event data.


Publication

This is the code for the IEEE TNNLS paper Event-based Motion Segmentation with Spatio-Temporal Graph Cuts by Yi Zhou, Guillermo Gallego, Xiuyuan Lu, Siqi Liu and Shaojie Shen.

If you use any of this code, please cite the following publication:

@Article{Zhou21tnnls,
  title   = {Event-based Motion Segmentation with Spatio-Temporal Graph Cuts},
  author  = {Zhou, Yi and Gallego, Guillermo and Lu, Xiuyuan and Liu, Siqi and Shen, Shaojie},
  journal = {{IEEE} Transactions on Neural Networks and Learning Systems},
  year    = {2023},
  volume  = {34},
  number  = {8},
  pages   = {4868--4880},
  doi     = {10.1109/TNNLS.2021.3124580}  
}

Also note that the implementation of event warping and contrast maximization is based on dvs_global_flow. Please cite the corresponding publications if you use them.

1. Installation

We have tested our code on machines with the following configurations:

1.1 Driver Installation

To work with event cameras, in particular the Dynamic Vision Sensors (DVS/DAVIS), you need to install some drivers. Please follow the instructions (steps 1-9) at rpg_dvs_ros before moving on to the next step. Note that you need to replace the name of the ROS distribution with the one installed on your computer. We use catkin tools to build the code; it should have been installed during the driver installation.

1.2 Dependencies Installation

You should have created a catkin workspace in Section 1.1. If not, please go back and create one.

Clone this repository into the src folder of your catkin workspace. In a terminal, run:

cd ~/catkin_ws/src 
git clone https://github.com/HKUST-Aerial-Robotics/EMSGC.git

Dependencies are specified in the file dependencies.yaml. They can be installed with the following commands from the src folder of your catkin workspace:

cd ~/catkin_ws/src
sudo apt-get install python3-vcstool
vcs-import < EMSGC/dependencies.yaml

The previous command should clone the repositories into folders called catkin_simple, glog_catkin, gflags_catkin, minkindr, etc. inside the src folder of your catkin workspace, at the same level as this repository (EMSGC).
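
As a quick sanity check, you can verify that the expected folders exist (a sketch; the exact folder names depend on your dependencies.yaml, so adjust the list accordingly):

```shell
#!/bin/sh
# Check that vcs-import cloned the expected dependency folders
# (folder names taken from the list above; adjust to your dependencies.yaml).
cd ~/catkin_ws/src 2>/dev/null || echo "catkin workspace not found at ~/catkin_ws"
for pkg in EMSGC catkin_simple glog_catkin gflags_catkin minkindr; do
  if [ -d "$pkg" ]; then
    echo "$pkg: ok"
  else
    echo "$pkg: missing"
  fi
done
```

Any folder reported as missing can be cloned manually from its repository.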

You may need autoreconf to compile glog_catkin. To install autoreconf, run

sudo apt-get install autoconf

Note that the above command may differ across Ubuntu versions. Please refer to https://askubuntu.com/a/269423 for details.

yaml-cpp is only used for loading calibration parameters from yaml files:

cd ~/catkin_ws/src 
git clone https://github.com/jbeder/yaml-cpp.git
cd yaml-cpp
mkdir build && cd build && cmake -DYAML_BUILD_SHARED_LIBS=ON ..
make -j

Other ROS dependencies should have been installed in Section 1.1. If any are missing, install them accordingly. Besides, you also need to have OpenCV (3.2 or later) and Eigen 3 installed.
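
The following snippet can help confirm that the OpenCV and Eigen development files are visible (a sketch; the pkg-config module names are assumptions and vary across Ubuntu releases, e.g. opencv vs opencv4):

```shell
#!/bin/sh
# Query installed dependency versions via pkg-config.
# Module names are assumptions: on some systems OpenCV registers
# as "opencv", on others as "opencv4".
echo "Eigen 3: $(pkg-config --modversion eigen3 2>/dev/null || echo 'not found')"
echo "OpenCV:  $(pkg-config --modversion opencv4 2>/dev/null \
  || pkg-config --modversion opencv 2>/dev/null \
  || echo 'not found')"
```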

Furthermore, the GNU Scientific Library (GSL) needs to be installed. If missing, you can install it with

sudo apt-get install libgsl-dev

1.3 Installation

After cloning this repository as described above (shown again for convenience):

cd ~/catkin_ws/src 
git clone https://github.com/HKUST-Aerial-Robotics/EMSGC.git

run

catkin build emsgc
source ~/catkin_ws/devel/setup.bash
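
To confirm that the package built and the environment is set up, a quick check (a sketch; paths assume the default ~/catkin_ws location) is:

```shell
#!/bin/sh
# Source the workspace if it exists, then check that ROS tools and the
# emsgc package are visible (paths assume the default ~/catkin_ws layout).
if [ -f ~/catkin_ws/devel/setup.sh ]; then
  . ~/catkin_ws/devel/setup.sh
fi
if command -v rospack >/dev/null 2>&1; then
  rospack find emsgc && echo "emsgc package found"
else
  echo "ROS environment not on PATH; re-check Section 1.1"
fi
```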

2. Usage

First you need to download rosbag files from the EMSGC Project Page.

Once you have the data ready, go to the launch file and adapt the paths to your setup, including:

Then run, e.g.:

$ roslaunch emsgc box_seq00.launch

3. Parameters

MRF Parameters

Initialization

You may set the verbosity / printing level directly in the command line by setting the value of the variable GLOG_v (>= 0). Example:

env GLOG_v=2 roslaunch emsgc box_seq00.launch

5. Data

Data can be downloaded from the Project page.

6. FAQs

7. License

EMSGC is licensed under the GNU General Public License Version 3 (GPLv3), see http://www.gnu.org/licenses/gpl.html.

For commercial use, please contact Yi Zhou and Shaojie Shen. Email: eeyzhou@hnu.edu.cn; eeshaojie@ust.hk.

8. Additional Resources on Event Cameras

Event-based Stereo Visual Odometry

Event-based Vision Survey Paper