Home


The Gelslight is the PRL's vision-based tactile sensor, designed to replace the ATI Nano-25 F/T sensor used for sensorized feeding on ADA. Inspired by MIT's GelSight sensor, the Gelslight achieves the smallest form factor in its class by combining a miniature wide-angle camera, clever design, and precision manufacturing equipment.

Marker Tracking Algorithm

Dependencies

Installation

Clone into the src folder of your catkin workspace.

git clone https://github.com/personalrobotics/gelslight_tracking.git
cd gelslight_tracking

Ensure that the Python version referenced in gelslight_tracking/makefile matches your installed version of Python. Then remove src/find_marker.so and recompile:

rm -rf src/find_marker.so
make

Sensor Configuration

In the first iteration of the marker tracking algorithm, the detected markers are matched to an initialized grid of markers. For this to work properly, each point in the initialized grid must start close to its corresponding detected marker. This means there are two potential sources of error: failing to detect all of the markers, and error in the initial marker positions. Careful tuning of these settings will ensure the success of the marker tracking algorithm.
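To illustrate why the initialized grid must start close to the detections, here is a simplified sketch of the matching step. The real matching lives in the compiled find_marker module, so the function below (`match_to_grid`) is a hypothetical nearest-neighbor version, not the actual implementation: each grid point pairs with its nearest unclaimed detection, which fails once the offset exceeds the allowed distance.

```python
# Simplified sketch of matching detected markers to the initialized grid.
# Hypothetical stand-in for the compiled find_marker logic: each grid
# point greedily claims its nearest unmatched detection within max_dist.
import math

def match_to_grid(detections, grid, max_dist):
    """Greedy nearest-neighbor match; returns {grid_index: detection_index}."""
    matches = {}
    used = set()
    for gi, (gx, gy) in enumerate(grid):
        best, best_d = None, max_dist
        for di, (px, py) in enumerate(detections):
            if di in used:
                continue
            d = math.hypot(gx - px, gy - py)
            if d < best_d:
                best, best_d = di, d
        if best is not None:
            matches[gi] = best
            used.add(best)
    return matches
```

If the grid is offset by more than roughly half the marker spacing, points claim the wrong detections (or none at all), which is exactly the failure mode the tuning steps below guard against.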

Step 1: Marker detection

The marker detection is implemented in src/marker_detection.py, which uses Gaussian filtering and a min-max threshold in HSV space to detect the yellow markers.
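The core of the min-max HSV threshold can be sketched without OpenCV. The version below is a NumPy stand-in for the cv2 calls (roughly cv2.GaussianBlur followed by cv2.inRange) used by the actual script; the threshold values are illustrative placeholders, not the tuned settings from the repo.

```python
# NumPy sketch of the min-max HSV threshold that isolates yellow markers.
# The real script uses OpenCV; the bounds below are illustrative only and
# must be tuned with test_find_marker.py as described in the text.
import numpy as np

YELLOW_LO = np.array([20, 100, 100])   # H, S, V lower bound (illustrative)
YELLOW_HI = np.array([35, 255, 255])   # H, S, V upper bound (illustrative)

def hsv_mask(hsv_image, lo=YELLOW_LO, hi=YELLOW_HI):
    """Return a binary mask: 255 where every channel lies within [lo, hi]."""
    in_range = np.logical_and(hsv_image >= lo, hsv_image <= hi).all(axis=-1)
    return (in_range * 255).astype(np.uint8)
```

Pixels inside the band become the white dots on a black background that the tuning step asks you to produce.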

Modify the parameters using the test_find_marker.py script, which takes the image rescale value as an argument. Two images will appear: one is the raw image from the sensor, and the other is the filtered image. Slide the adjuster bars until white dots corresponding to the markers appear on a black background.

python3 src/test_find_marker.py 3

Parameters:

Step 2: Initial Marker Positions

The initial marker positions are implemented as an N x M array of points with constant spacing dx, dy in the x and y directions. The position of the array is defined by its top-left dot, x0, y0. These values are located in

src/setting.py
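Constructing the grid from these values is straightforward. The sketch below uses the names from the text above (N, M, x0, y0, dx, dy); the exact variable names in src/setting.py may differ.

```python
# Sketch of building the initial N x M marker grid from the values in
# src/setting.py: constant spacing (dx, dy), anchored at the top-left
# dot (x0, y0). Variable names follow the wiki text, not necessarily
# the actual setting.py identifiers.
def init_grid(N, M, x0, y0, dx, dy):
    """Return N*M (x, y) points, row-major from the top-left dot."""
    return [(x0 + j * dx, y0 + i * dy) for i in range(N) for j in range(M)]
```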

There is no proper strain relief on the camera cable inside the sensor, which causes the camera to shift whenever the sensor is removed from ADA. This changes the initial marker positions, so expect to spend time re-tuning these values.

Modify the values using the test_settings.py script, which takes the rescale value as an argument. Initially, with non-zero dx and dy values, a matrix of green arrows will point from the red detected markers to the initial marker positions. Adjust the sliders until the arrows shrink into small green spots inside the red markers. Once the arrows have "snapped" into the dots, further slider changes make no considerable difference; any further adjustment should be done by re-running the script.

python3 src/test_settings.py 3

Parameters:

Running Dot Tracking

Base version

The dot tracking script tracking.py takes two arguments: the sensor ID and the rescale value. These inputs select the dot tracking configuration in the setting.py file.

Launch roscore.

roscore

In another terminal, run the script

cd ~/<path_to_ws>/src/gelslight_tracking
python3 src/tracking.py 1 3

which takes the sensor identifier and the image rescale value as arguments, respectively.

Running on ADA

Turn on ADA. SSH into the Nano with X forwarding and set weebo as the ROS master:

ssh -X nano
useweebo

Either run the base tracking script or the tracking-with-taring-action script (if running the feeding demo without the ATI F/T sensor). Note that the Nano uses Python 2 rather than Python 3.

cd ~/catkin_ws/src/gelslight_tracking
python2 src/tracking_w_taring_action.py 1 3

Note: which USB ports on the Nano the sensors are plugged into affects the ordering of the camera ports when ADA is powered on.

<img src="hardware/pictures/nano_port_id.PNG" width="400"/>

Output

Tracking

The tracking algorithms will display the camera feed and print the force and torque in the Z-direction. The node initialized by these scripts publishes the camera feed and a 6-DOF wrench (only the two Z-axis components are implemented).
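The two implemented wrench components can be understood as simple statistics over the marker displacement field. The function below (`fz_tz_proxy`) is a hedged sketch of one plausible computation, not the code in src/tracking.py: mean displacement magnitude stands in for normal force, and the mean tangential component about the image center stands in for torque about Z.

```python
# Hedged sketch of deriving the two published quantities from marker
# flow: mean displacement magnitude (a proxy for Fz) and the mean
# "curl" about the image center (a proxy for Tz). The actual
# computation lives in src/tracking.py and may differ.
import math

def fz_tz_proxy(initial, current):
    """initial/current: equal-length lists of (x, y) marker positions."""
    n = len(initial)
    cx = sum(x for x, _ in initial) / n
    cy = sum(y for _, y in initial) / n
    fz, tz = 0.0, 0.0
    for (x0, y0), (x1, y1) in zip(initial, current):
        ux, uy = x1 - x0, y1 - y0       # displacement vector
        fz += math.hypot(ux, uy)        # displacement magnitude
        rx, ry = x0 - cx, y0 - cy       # lever arm from the center
        tz += rx * uy - ry * ux         # z-component of r x u
    return fz / n, tz / n
```

A pure twist about the gel center produces a large Tz proxy and a uniform press produces a large Fz proxy, matching the intuition behind the printed readings.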

Force-Torque Calibration

In its current state, the output of the sensor is the average marker displacement and the curl about the z-axis. Calibration must be performed to map these outputs to force and torque. A relevant paper on calibration of other tactile sensors on ADA may be found here.

The general idea is to attach the Gelslight sensor to ADA, grip a fork handle instrumented with the industrial force-torque (FT) sensor, have the robot push the fork into an object while recording the readings from both sensors, and then compute the mapping from the tactile sensor readings to the FT sensor readings.

Common Issues

To Do