Pretrained DLCHOMP Networks for Manipulator Motion Planning

This repository provides pretrained networks for Deep-Learning-Based Covariant Hamiltonian Optimization for Motion Planning (DLCHOMP) of robotic manipulators in MATLAB®. These pretrained networks can output intermediate trajectory guesses for a desired start-to-goal configuration pair in a given spherical obstacle environment.

DLCHOMP High Level Visualization

Creator: MathWorks Development

Includes untrained model: ❌

Includes transfer learning script: ✅
(See this page)

Supported Robots: ✅
(See this page)

Requirements

Getting Started

Download or clone this repository to your machine and open it in MATLAB®.

Setup

Add path to the source directory.

addpath("src");

Download Pretrained Network

Use the code below to download the pretrained network for a supported robot. For a list of supported robots, see the table in the Metrics and Evaluation section. To demonstrate this workflow, download the pretrained DLCHOMP network for the KUKA LBR iiwa 7.

robotName = "kukaIiwa7";
data = helper.downloadPretrainedDLCHOMPForRobot(robotName);

Summarize Pretrained Network

Extract the pretrained network from the downloaded data. You can then summarize and analyze it to better understand its architecture.

pretrainedNetwork = data.trainedNetwork;
summary(pretrainedNetwork);
analyzeNetwork(pretrainedNetwork);

If you only need the pretrained network and not the pretrained optimizer, skip to the DLCHOMP Details section. Otherwise, continue to obtain a pretrained optimizer.

Obtain Pretrained DLCHOMP Optimizer

Extract the pretrained optimizer.

pretrainedDLCHOMP = data.trainedDLCHOMP;

Predict Trajectory Using Pretrained DLCHOMP Optimizer

Use the pretrained optimizer to predict a trajectory between a start joint configuration and goal joint configuration in an obstacle environment.

% Assign an unseen obstacle environment to the DLCHOMP optimizer.
pretrainedDLCHOMP.SphericalObstacles = data.unseenObstacles;

% Predict a trajectory using the optimizer.
[optimWpts,optimTpts,solinfo] = optimize(pretrainedDLCHOMP,data.unseenStart,data.unseenGoal);

% Visualize results.
show(pretrainedDLCHOMP,optimWpts);

DLCHOMP Output Prediction

Create and Train DLCHOMP Optimizer for New Applications

To generate data and train a DLCHOMP optimizer to suit your application or task, follow the Train Deep-Learning-Based CHOMP Optimizer for Motion Planning example.

Train Custom DLCHOMP Optimizer Using Transfer Learning

Transfer learning enables you to adapt a pretrained DLCHOMP optimizer to your own dataset. Follow the transfer learning examples in this repository to create a custom DLCHOMP optimizer and train it for your application.

DLCHOMP Details

You can enhance the efficiency of optimization-based motion planning tasks by applying deep learning [1]. dlCHOMP is a MATLAB® feature that uses a neural network to provide an educated initial guess for a robot's intermediate start-to-goal trajectory, which is then optimized using the Covariant Hamiltonian Optimization for Motion Planning (CHOMP) [2] algorithm.

DLCHOMP Overview

Neural Network Details

This figure shows the architecture of the DLCHOMP neural network [2].

DLCHOMP Network Architecture

The network takes a given motion task (a world obstacle encoding vector WB, a start configuration q1, and a goal configuration qNt) and outputs an initial trajectory guess Q. Blocks of tapered fully connected layers (gray) are combined, as in the DenseNet architecture [3], via skip connections and concatenations (circular nodes).
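The DenseNet-style combination of tapered fully connected blocks can be sketched with Deep Learning Toolbox layers. This is an illustrative sketch only, not the shipped architecture; the input size and layer widths are assumptions.

```matlab
% Illustrative sketch (not the shipped DLCHOMP architecture): one tapered
% fully connected block whose input is concatenated with its output,
% mimicking a DenseNet-style skip connection. Sizes are assumed.
inputSize = 138;   % e.g. obstacle encoding + start + goal (assumed)
layers = [
    featureInputLayer(inputSize, Name="in")
    fullyConnectedLayer(256, Name="fc1")
    reluLayer(Name="relu1")
    fullyConnectedLayer(128, Name="fc2")
    reluLayer(Name="relu2")
    concatenationLayer(1, 2, Name="cat")   % concatenate block input and output
    fullyConnectedLayer(64, Name="fc_out")];
lgraph = layerGraph(layers);
lgraph = connectLayers(lgraph, "in", "cat/in2");   % the skip connection
net = dlnetwork(lgraph);
```

Stacking several such blocks, each concatenating its input with its output, yields the dense connectivity pattern shown in the architecture figure.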

Metrics and Evaluation

The test dataset for each pretrained network consists of 1000 data samples, identical to the validation dataset created during the network training phase. To augment the data, these 1000 test samples were flipped, exploiting the symmetric nature of the motion planning problem, for a total of 2000 test data samples. The results in these tables are for the 2000-sample test sets created for each robot.
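The flip augmentation exploits the fact that a valid trajectory from start to goal, reversed, is a valid trajectory from goal to start in the same obstacle environment. A minimal sketch, with made-up field names and values for illustration:

```matlab
% Toy sample: 5 waypoints for a 2-DOF robot (values are illustrative).
sample.start     = [0 0];
sample.goal      = [1 1];
sample.waypoints = linspace(0, 1, 5).' * [1 1];
sample.obstacles = [];

% Flip augmentation: swap start and goal and reverse the waypoint order;
% the obstacle environment is unchanged.
flipped.start     = sample.goal;
flipped.goal      = sample.start;
flipped.waypoints = flipud(sample.waypoints);
flipped.obstacles = sample.obstacles;
```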

Size and Accuracy Metrics

<table> <tr> <th>Header</th> <th>Definition</th> </tr> <tr> <th>DLCHOMP Optimizer</th> <td>Short-hand name of the supported robot whose metrics are listed in the current row. To obtain the full robot name, and hence determine the exact robot model, see the [Pretrained Optimizers](https://www.mathworks.com/help/releases/R2024a/robotics/ref/dlchomp.html#mw_93957c22-f6cc-4e8c-ac45-1ae16cfcb2ef) section of the dlCHOMP page.</td> </tr> <tr> <th>Size (MB)</th> <td>Memory footprint of the DLCHOMP optimizer object in megabytes.</td> </tr> <tr> <th>% of samples with DLCHOMP Itns < CHOMP</th> <td>Percentage of data samples for which the dlCHOMP optimizer took fewer iterations than an equivalent manipulatorCHOMP optimizer with similar optimization options.</td> </tr> <tr> <th>Mean % of Itns Saved</th> <td>Mean percentage of iterations saved by the dlCHOMP optimizer, over the data samples for which it took fewer iterations than the equivalent manipulatorCHOMP optimizer.</td> </tr> <tr> <th>% of samples with DLCHOMP Inference Time < CHOMP</th> <td>Percentage of data samples for which the dlCHOMP optimizer's inference time was less than that of an equivalent manipulatorCHOMP optimizer.</td> </tr> <tr> <th>Mean % of Inference Time Saved</th> <td>Mean percentage of inference time saved by the dlCHOMP optimizer, over the data samples for which its inference time was less than that of the equivalent manipulatorCHOMP optimizer. The inference time of a dlCHOMP optimizer is the sum of the network guess time and the subsequent optimization time. The inference time of a manipulatorCHOMP optimizer is the same as its optimization time, since it has no neural network component.</td> </tr> <tr> <th>Feasibility</th> <td>Percentage of test data samples for which the dlCHOMP optimizer produced a collision-free optimized trajectory.</td> </tr> </table>

The table above defines the headers present in the table below:

| DLCHOMP Optimizer | Size (MB) | % of samples with DLCHOMP Itns < CHOMP | Mean % of Itns Saved | % of samples with DLCHOMP Inference Time < CHOMP | Mean % of Inference Time Saved | Feasibility |
| --- | --- | --- | --- | --- | --- | --- |
| abbYuMi | 25 | 84.70 | 78.36 | 78.20 | 73.95 | 74.50 |
| fanucLRMate200ib | 25 | 87.20 | 80.42 | 79.10 | 72.30 | 78.65 |
| fanucM16ib | 25 | 75.30 | 82.17 | 67.40 | 76.09 | 73.95 |
| frankaEmikaPanda | 25 | 84.40 | 83.10 | 77.60 | 77.93 | 77.25 |
| kinovaJacoJ2S7S300 | 25 | 99.00 | 78.63 | 74.00 | 75.07 | 68.55 |
| kinovaGen3 | 25 | 77.90 | 74.56 | 63.70 | 67.44 | 72.15 |
| kukaIiwa7 | 25 | 83.80 | 79.02 | 74.90 | 72.38 | 80.40 |
| meca500r3 | 25 | 85.00 | 79.24 | 74.90 | 71.41 | 65.15 |
| techmanTM5-700 | 25 | 78.40 | 74.49 | 67.40 | 66.38 | 71.20 |
| universalUR5e | 25 | 73.60 | 76.44 | 62.20 | 70.21 | 71.05 |
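As a concrete illustration of the iteration-savings metrics defined above, given per-sample iteration counts for dlCHOMP and an equivalent manipulatorCHOMP run (the arrays below are made-up numbers, not measured data), the iteration columns can be computed as:

```matlab
% Made-up per-sample iteration counts, for illustration only.
dlchompItns = [12 30 18 45 22];
chompItns   = [20 28 31 45 40];

% "% of samples with DLCHOMP Itns < CHOMP"
saved = dlchompItns < chompItns;
pctSamplesSaved = 100 * mean(saved);

% "Mean % of Itns Saved", averaged only over the samples where
% dlCHOMP took fewer iterations than CHOMP.
pctSaved = 100 * (chompItns(saved) - dlchompItns(saved)) ./ chompItns(saved);
meanPctItnsSaved = mean(pctSaved);
```

The time-savings columns are computed the same way, with per-sample inference times in place of iteration counts.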

CPU Time Metrics

<table> <tr> <th>Header</th> <th>Definition</th> </tr> <tr> <th>DLCHOMP Model Without Codegen</th> <td>Short-hand name of the supported robot whose metrics are listed in the current row. These metrics were computed in MATLAB without using its code generation feature. To obtain the full robot name, and hence determine the exact robot model, see the [Pretrained Optimizers](https://www.mathworks.com/help/releases/R2024a/robotics/ref/dlchomp.html#mw_93957c22-f6cc-4e8c-ac45-1ae16cfcb2ef) section of the dlCHOMP page.</td> </tr> <tr> <th>Mean Network Guess Time (secs)</th> <td>Mean time, in seconds, taken by the dlCHOMP optimizer to obtain its neural network's intermediate guess trajectory.</td> </tr> <tr> <th>Mean Inference Time (secs)</th> <td>Mean total time, in seconds, taken by the dlCHOMP optimizer to obtain its neural network's intermediate guess trajectory and then optimize it using CHOMP.</td> </tr> </table>

The table above defines the headers present in the table below:

| DLCHOMP Model Without Codegen | Mean Network Guess Time (secs) | Mean Inference Time (secs) |
| --- | --- | --- |
| abbYuMi | 0.0072 | 19.2465 |
| fanucLRMate200ib | 0.0100 | 0.6899 |
| fanucM16ib | 0.0069 | 1.2242 |
| frankaEmikaPanda | 0.0075 | 1.6912 |
| kinovaJacoJ2S7S300 | 0.0098 | 4.4300 |
| kinovaGen3 | 0.0072 | 3.0774 |
| kukaIiwa7 | 0.0060 | 1.5289 |
| meca500r3 | 0.0057 | 0.5911 |
| techmanTM5-700 | 0.0052 | 1.2719 |
| universalUR5e | 0.0075 | 1.6614 |

Note: Because the dual-arm YuMi® (abbYuMi) is the only two-armed robot in this list, its optimization takes much longer than that of the other robots due to the higher probability of self-collisions.

References

[1] J. Tenhumberg, D. Burschka and B. Bäuml, "Speeding Up Optimization-based Motion Planning through Deep Learning," 2022 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Kyoto, Japan, 2022, pp. 7182-7189, doi: 10.1109/IROS47612.2022.9981717.

[2] N. Ratliff, M. Zucker, J. A. Bagnell and S. Srinivasa, "CHOMP: Gradient optimization techniques for efficient motion planning," 2009 IEEE International Conference on Robotics and Automation, 2009, pp. 489-494, doi: 10.1109/ROBOT.2009.5152817.

[3] S. Jégou et al., "The One Hundred Layers Tiramisu: Fully Convolutional DenseNets for Semantic Segmentation," CoRR, vol. abs/1611.09326, 2016.

Copyright 2024 The MathWorks, Inc.