metavision_driver

A combined ROS/ROS2 driver for event-based cameras using Prophesee's Metavision SDK. This driver is not written or supported by Prophesee.

If you are looking for more speed and features than the official Prophesee ROS driver, you have found the right repository. This driver can cope with the large amount of data produced by Prophesee's Gen3 and Gen4 sensors because it does little more than get the RAW (currently EVT3 format) events from the camera and publish them in ROS event_camera_msgs format.

The events can be decoded and displayed using the following ROS/ROS2 packages:

Supported platforms

Tested on the following platforms:

Tested on the following hardware:

Explicitly not supported: any data in the old EVT2 format. The sensor must produce data in the EVT3 format or later.

Installation from binaries

On ROS2 you can install via

sudo apt install ros-${ROS_DISTRO}-metavision-driver

This will also install a version of OpenEB under /opt/ros/${ROS_DISTRO}. Unless you separately install OpenEB, you will still have to install the required udev rules from the source tree directory udev/rules.d/ and restart udev; see OpenEB. For the SilkyEV cameras, make sure that the plugin version matches the version of the installed ROS OpenEB vendor package (check the CMakeLists.txt file), or the camera will not be recognized! If the driver does not find the camera, try building from source (see below).
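If you need to install the rules manually, a minimal sketch (assuming the rules files live in the udev/rules.d/ directory of the OpenEB source tree mentioned above; exact file names may vary between OpenEB versions):

# run from the root of the OpenEB source tree (assumption)
sudo cp udev/rules.d/*.rules /etc/udev/rules.d/
# reload the rules and re-trigger device events
sudo udevadm control --reload-rules
sudo udevadm trigger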

How to build from source

Prerequisites:

Make sure you have your ROS1 or ROS2 environment sourced such that ROS_VERSION is set. For example, for ROS1 Noetic:

source /opt/ros/noetic/setup.bash
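or, for ROS2 (using Humble as an example; substitute your distro):

source /opt/ros/humble/setup.bash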

Create a workspace (metavision_driver_ws), clone this repo, and use vcs to pull in the remaining dependencies:

pkg=metavision_driver
mkdir -p ~/${pkg}_ws/src
cd ~/${pkg}_ws
git clone https://github.com/ros-event-camera/metavision_driver.git src/${pkg}
cd src
vcs import < ${pkg}/${pkg}.repos
cd ..

Optional (ROS1): to use the efficient recording nodelet, clone the nodelet_rosbag repository into the src tree:

git clone -b open_bag_later git@github.com:berndpfrommer/nodelet_rosbag.git src/nodelet_rosbag

Now configure and build:

ROS1:

catkin config -DCMAKE_BUILD_TYPE=RelWithDebInfo  # (optionally add -DCMAKE_EXPORT_COMPILE_COMMANDS=1)
catkin build
. devel/setup.bash

ROS2:

colcon build --symlink-install --cmake-args -DCMAKE_BUILD_TYPE=RelWithDebInfo -DCMAKE_EXPORT_COMPILE_COMMANDS=ON
. install/setup.bash

Driver Features

This driver differs from the Prophesee ROS driver in the following ways:

Parameters:

Services:

Dynamic reconfiguration parameters (see MetaVision documentation here):
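As a hedged example of changing such a parameter at runtime (the parameter name bias_diff_on and the value are illustrative assumptions taken from the Metavision bias documentation; the node name /event_camera matches the default shown in the statistics printout further below):

# ROS1: adjust parameters interactively via the dynamic reconfigure GUI
rosrun rqt_reconfigure rqt_reconfigure
# ROS2: set a parameter on the running driver node (name and value are placeholders)
ros2 param set /event_camera bias_diff_on 400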

How to use (ROS1):

roslaunch metavision_driver driver_node.launch   # (run as node)
roslaunch metavision_driver driver_nodelet.launch   # (run as nodelet)

The driver should print out message rate statistics like this:

[ INFO] [1663845360.494651335]: /event_camera: bw in: 106.08359 MB/s, msgs/s in:   51930, out:       0

This prints the incoming bandwidth (from the SDK) and the incoming and published message rates. In multithreaded mode, the maximum queue size observed during statistics_print_interval is also shown.

To use the combined driver/recording facility:

roslaunch metavision_driver recording_driver.launch bag:=`pwd`/test.bag

Then start/stop recording like this:

rosrun metavision_driver start_recording.py
rosrun metavision_driver stop_recording.py

To visualize the events, run a renderer node from the event_camera_renderer package:

roslaunch event_camera_renderer renderer.launch

The renderer node publishes an image that can be visualized with e.g. rqt_image_view.
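For example (select the renderer's image topic in the GUI):

rosrun rqt_image_view rqt_image_view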

How to use (ROS2):

Running the driver

You can just start up a ROS2 node:

ros2 launch metavision_driver driver_node.launch.py        # (run as node)

but if you want to efficiently record the events (see below), you should launch instead as a composable node:

ros2 launch metavision_driver driver_composition.launch.py # (run as composable node)

Either way, the printout should be similar to the one for ROS1.

Visualizing the events

To visualize the events, run a renderer node from the event_camera_renderer package:

ros2 launch event_camera_renderer renderer.launch.py

The renderer node publishes an image that can be visualized with e.g. rqt_image_view.
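For example (select the renderer's image topic in the GUI):

ros2 run rqt_image_view rqt_image_view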

Recording on ROS2 versions older than Jazzy

You will need to install the composable recorder into your workspace as well (see below), and launch the combined driver and recorder:

ros2 launch metavision_driver recording_driver.launch.py
ros2 run rosbag2_composable_recorder start_recording.py

To stop the recording, Ctrl-C the recording_driver.launch.py script.

Recording on ROS2 Jazzy or later

Launch the composable driver:

ros2 launch metavision_driver driver_composition.launch.py

and start the recording like this:

ros2 launch metavision_driver start_recording.launch.py

Then stop it like so:

ros2 run metavision_driver stop_recording_ros2.py

Note that the start/stop scripts and launch files need to be adjusted to fit your choice of node names and topics, but if you leave everything at its defaults it should work out of the box.

CPU load

Here are some approximate performance numbers on a 16-thread (8-core) AMD Ryzen 7480h laptop with a maximum clock speed of 2.9 GHz. All numbers were obtained with a SilkyEVCam producing close to its maximum event rate (about 48 Mev/s):

ROS1

All CPU loads below are with the sensor saturating at close to 50 Mev/s.

| settings                       | single threaded | multi threaded | note                                        |
|--------------------------------|-----------------|----------------|---------------------------------------------|
| driver, no subscriber          | 22%             | 59%            | no pub, extra copy for multithreaded        |
| driver with subscriber         | 37%             | 61%            | does interprocess communication             |
| driver + rosbag record node    | 70%             | 96%            | combined driver + record cpu load           |
| driver + rosbag record nodelet | 52%             | 74%            | single process, no ipc, but disk/io         |

ROS2

All CPU loads below are with the sensor saturating at close to 50 Mev/s. The middleware used was CycloneDDS.

| settings                          | single threaded | multi threaded | note                                        |
|-----------------------------------|-----------------|----------------|---------------------------------------------|
| driver, no subscriber             | 22%             | 59%            | no pub, extra copy for multithreaded        |
| driver with subscriber            | 35%             | 44%            | does interprocess communication             |
| driver + rosbag record node       | 80%             | 90%            | combined driver + record cpu load           |
| driver + rosbag record composable | 58%             | 80%            | single process, no ipc, but disk/io         |

About ROS time stamps

The SDK provides hardware event time stamps directly from the camera. For efficiency reasons the packets are not decoded, so the sensor time stamps are not available to the driver. The ROS driver therefore simply puts the host wall-clock arrival time of the first SDK packet into the ROS message's header stamp field.

External triggering

External triggers on Prophesee cameras allow a signal to be injected into the event stream. This is useful for synchronizing external devices. The event stream contained in the published packets will then also contain trigger events that can be recovered with the decoder.

Prophesee provides documentation on the trigger functions at a high level here.

Trigger out functionality is exposed through trigger_out_mode, trigger_out_period, and trigger_out_duty_cycle. These parameters have the same meaning as laid out in the Prophesee documentation.
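As a rough, unverified sketch, the trigger-out signal could be configured at startup along these lines; the executable name driver_node is inferred from the launch file name, and the mode string, period (microseconds), and duty cycle values are assumptions to be checked against the driver and Metavision documentation:

# ROS2 sketch: start the driver with trigger-out enabled (values are placeholders)
ros2 run metavision_driver driver_node --ros-args \
  -p trigger_out_mode:=enabled \
  -p trigger_out_period:=100000 \
  -p trigger_out_duty_cycle:=0.5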

Trigger in functionality is abstracted away from pins to just loopback or external, since the pin mappings are fixed for a given camera configuration.

WARNING: Running synchronization and triggers at the same time is possible, but requires an understanding of your camera's underlying hardware (most cameras share the trigger-out and sync-out pins).

Hardware configuration

The hardware configuration file is config/trigger_pins.yaml. The mappings for hal_plugin_gen* come from Prophesee documentation. The mapping for evc3a_plugin_gen31 has been validated on the SilkyEvCam (March 2022). The mapping goes from the HAL Software Info to pin numbers.

If your camera is not yet supported, the software info is printed out on driver startup. Look for a line that contains:

Plugin Software Name:

This name is the key to place under prophesee_pin_config, which can then be populated based on your camera's documentation.
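A rough sketch of what such an entry might look like (shown here as a shell heredoc only for illustration; apart from prophesee_pin_config, the key names and pin numbers are assumptions and must be replaced with values from your camera's documentation):

# sketch only: the layout of config/trigger_pins.yaml is assumed here
cat <<'EOF'
prophesee_pin_config:
  <Plugin Software Name reported at startup>:
    external: <trigger-in pin number>
    loopback: <loopback pin number>
EOF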

WARNING If this file is not loaded (or your camera is not yet supported), the default pin loaded will be 0. This may work in some cases, but not all.

SilkyEvCam

Documentation on the SilkyEvCam pinout can be found here on page 6. This system uses 3.3V logic for both trigger in and trigger out.

While the loopback configuration is internal to the chip, the trigger out line will still pulse externally. This is useful when using an event camera to trigger an external system, since the timing remains relative to the camera's internal clock (after associating the trigger events with the external system).

Other cameras

External triggering works on SilkyEVCam HD, i.e. you can trigger the SilkyEVCam HD externally the same way as the SilkyEVCam VGA.

License

This software is issued under the Apache License Version 2.0.