DISCONTINUATION OF PROJECT.
This project will no longer be maintained by Intel.
Intel has ceased development of and contributions to this project, including, but not limited to, maintenance, bug fixes, new releases, and updates.
Intel no longer accepts patches to this project.
If you have an ongoing need to use this project, are interested in independently developing it, or would like to maintain patches for the open source software community, please create your own fork of this project.
ros_object_analytics
Object Analytics (OA) is a ROS wrapper for real-time object detection, localization and tracking. These packages aim to provide real-time object analysis over RGB-D camera inputs, enabling ROS developers to easily create advanced robotics features such as intelligent collision avoidance and semantic SLAM. It consumes sensor_msgs::PointCloud2 data delivered by an RGB-D camera and publishes topics for object detection, object tracking, and object localization in the 3D camera coordinate system.
OA keeps integrating with various state-of-the-art algorithms:
- Object detection offloaded to the GPU, via ros_opencl_caffe, with the YOLO v2 model and the OpenCL Caffe framework
- Object detection offloaded to the VPU, via ros_intel_movidius_ncs (devel branch), with the MobileNet SSD model and the Caffe framework
compiling dependencies
ROS packages from ros-kinetic-desktop-full
- roscpp
- nodelet
- std_msgs
- sensor_msgs
- geometry_msgs
- dynamic_reconfigure
- pcl_conversions
- cv_bridge
- libpcl-all
- libpcl-all-dev
- ros-kinetic-opencv3
Other ROS packages
NOTE: OA depends on the tracking feature of OpenCV (3.3 preferred, 3.2 minimum). The tracking feature is provided by recent builds of the ROS Kinetic package "ros-kinetic-opencv3" (which integrates OpenCV 3.3.1). However, if you are using an older ROS Kinetic release (which integrates OpenCV 3.2), the tracking feature is not included; in that case you need to build the tracking module from opencv_contrib yourself. It is important to keep opencv_contrib (self-built) and opencv (provided by ROS Kinetic) at the same OpenCV version, which can be checked in "/opt/ros/kinetic/share/opencv3/package.xml".
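To double-check which OpenCV your toolchain actually resolves to and whether the tracking module headers are available, a tiny stand-alone check such as the sketch below may help. The file name, build line, and paths are illustrative only and not part of this package; adjust them to your installation.

```cpp
// version_check.cpp -- hypothetical helper, not part of object_analytics.
// Example build line (illustrative; adjust include/library paths and names
// to the OpenCV actually installed on your system):
//   g++ version_check.cpp -I/opt/ros/kinetic/include/opencv-3.3.1-dev \
//       -L/opt/ros/kinetic/lib -lopencv_core3 -o version_check
#include <iostream>
#include <opencv2/core/version.hpp>  // defines CV_VERSION
#include <opencv2/tracking.hpp>      // compilation fails here if the tracking module is missing

int main() {
  // The printed version should match the one listed in
  // /opt/ros/kinetic/share/opencv3/package.xml.
  std::cout << "OpenCV version: " << CV_VERSION << std::endl;
  return 0;
}
```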
build and test
- to build
cd ${ros_ws} # "ros_ws" is the catkin workspace root directory in which this project is placed
catkin_make
- to test
catkin_make run_tests
- to install
catkin_make install
extra running dependencies
RGB-D camera
- librealsense2 tag v2.9.1 and realsense_ros_camera tag 2.0.2 if run with Intel RealSense D400
roslaunch realsense_ros_camera rs_rgbd.launch
- openni_launch or freenect_launch and their dependencies if run with Microsoft XBOX 360 Kinect
roslaunch openni_launch openni.launch
- ros_astra_camera if run with Astra Camera
roslaunch astra_launch astra.launch
command to launch object_analytics
- launch with OpenCL Caffe as detection backend
roslaunch object_analytics_launch analytics_opencl_caffe.launch
- launch with Movidius NCS as detection backend
roslaunch object_analytics_launch analytics_movidius_ncs.launch
Frequently used options
- input_points Specify arg "input_points" as the name of the topic on which the RGB-D camera publishes sensor_msgs::PointCloud2 messages. Default is "/camera/depth_registered/points" (compliant with the ROS OpenNI launch).
- aging_th Specify the tracking aging threshold, i.e. the number of frames since the last detection after which tracking is deactivated. Default is 16.
- probability_th Specify the probability threshold for tracking an object. Default is "0.5".
roslaunch object_analytics_launch analytics_movidius_ncs.launch aging_th:=30 probability_th:="0.3"
published topics
object_analytics/rgb (sensor_msgs::Image)
object_analytics/pointcloud (sensor_msgs::PointCloud2)
object_analytics/localization (object_analytics_msgs::ObjectsInBoxes3D)
object_analytics/tracking (object_analytics_msgs::TrackedObjects)
object_analytics/detection (object_msgs::ObjectsInBoxes)
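For reference, a minimal C++ node consuming one of these topics could look like the sketch below. It is illustrative only and not part of this package; the objects_in_boxes field name is assumed from the object_analytics_msgs/ObjectsInBoxes3D message definition and should be checked against the generated headers in your workspace.

```cpp
// localization_listener.cpp -- hypothetical example node, not part of this package.
#include <ros/ros.h>
#include <object_analytics_msgs/ObjectsInBoxes3D.h>

// Log how many objects were localized in each frame.
// "objects_in_boxes" is assumed from the ObjectsInBoxes3D message definition.
void onLocalization(const object_analytics_msgs::ObjectsInBoxes3D::ConstPtr& msg)
{
  ROS_INFO("frame %s: %zu objects localized",
           msg->header.frame_id.c_str(), msg->objects_in_boxes.size());
}

int main(int argc, char** argv)
{
  ros::init(argc, argv, "oa_localization_listener");
  ros::NodeHandle nh;
  ros::Subscriber sub =
      nh.subscribe("/object_analytics/localization", 10, onLocalization);
  ros::spin();
  return 0;
}
```

The same subscription pattern applies to the detection (object_msgs::ObjectsInBoxes) and tracking (object_analytics_msgs::TrackedObjects) topics.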
KPI of different detection backends
<table>
  <tr><td></td><td>topic</td><td>fps</td><td>latency (sec)</td></tr>
  <tr><td rowspan='3'>OpenCL Caffe</td><td>localization</td><td>6.63</td><td>0.23</td></tr>
  <tr><td>detection</td><td>8.88</td><td>0.17</td></tr>
  <tr><td>tracking</td><td>12.15</td><td>0.33</td></tr>
  <tr><td rowspan='3'>Movidius NCS</td><td>localization</td><td>7.44</td><td>0.21</td></tr>
  <tr><td>detection</td><td>10.5</td><td>0.15</td></tr>
  <tr><td>tracking</td><td>13.85</td><td>0.24</td></tr>
</table>

- CNN model of Movidius NCS is MobileNet
- Hardware: Intel(R) Xeon(R) CPU E3-1275 v5 @3.60GHz, 32GB RAM, Intel(R) RealSense R45
visualize tracking and localization results on RViz
Steps to enable visualization on RViz are as follows:
roslaunch object_analytics_visualization rviz.launch