End-to-end representation learning for Correlation Filter based tracking
Project page: http://www.robots.ox.ac.uk/~luca/cfnet.html
WARNING: we used Matlab 2015, MatConvNet v1.0-beta24, CUDA 8.0 and cuDNN 5.1. Other configurations might work, but it is not guaranteed. In particular, we received several reports of problems with Matlab 2017.
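If you need to (re)build MatConvNet against a specific CUDA/cuDNN pair, the sketch below shows one way to do it with MatConvNet's own `vl_compilenn`. The install paths are placeholders for illustration and must be adapted to your machine.

```matlab
% Minimal sketch: compiling MatConvNet against CUDA 8.0 and cuDNN 5.1.
% All paths below are placeholders -- point them at your own installs.
cd('/path/to/matconvnet');                    % your MatConvNet checkout
run matlab/vl_setupnn;                        % add MatConvNet to the MATLAB path
vl_compilenn('enableGpu',   true, ...
             'cudaRoot',    '/usr/local/cuda-8.0', ...
             'cudaMethod',  'nvcc', ...
             'enableCudnn', true, ...
             'cudnnRoot',   '/usr/local/cudnn-5.1');
vl_testnn('gpu', true);                       % optional GPU sanity check
```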
Getting started
[ Tracking only ] If you don't care about training, you can simply use one of our pretrained networks with our basic tracker.
1. Prerequisites: GPU, CUDA (we used 7.5), cuDNN (we used v5.1), Matlab, MatConvNet.
2. Clone the repository.
3. Download the pretrained networks from here and unzip the archive in `cfnet/pretrained/`.
4. Go to `cfnet/src/tracking/` and remove the trailing `.example` from `env_paths_tracking.m.example` and `startup.m.example`, editing the files as appropriate (a hedged sketch of what these files might contain follows this list).
5. Be sure to have at least one video sequence in the appropriate format. The easiest thing to do is to download the validation set (from here) that we used for the tracking evaluation and then extract the `validation` folder into `cfnet/data/`.
6. Start from one of the `cfnet/src/tracking/run_*_evaluation.m` entry points.
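As mentioned in point 4, the two renamed files just point MATLAB at your local installs and data. The snippet below is only an illustration of what they typically contain: the actual variable names expected by the code are defined in the `.example` templates, so keep those and only edit the paths.

```matlab
% Hypothetical contents -- variable names are illustrative; keep the ones
% defined in the original .example templates and only adjust the paths.

% startup.m: make MatConvNet and the tracker sources visible to MATLAB.
run('/path/to/matconvnet/matlab/vl_setupnn.m');       % your MatConvNet install
addpath(genpath('/path/to/cfnet/src'));               % the cfnet sources

% env_paths_tracking.m: where pretrained networks and sequences live.
paths.net_base  = '/path/to/cfnet/pretrained/';       % unzipped networks
paths.eval_base = '/path/to/cfnet/data/validation/';  % video sequences
```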
[ Training and tracking ] Start here if instead you prefer to DIY and train your own networks.
1. Prerequisites: GPU, CUDA (we used 7.5), cuDNN (we used v5.1), Matlab, MatConvNet.
2. Clone the repository.
3. Follow these step-by-step instructions, which will help you generate a curated dataset compatible with the rest of the code.
4. If you did not generate your own metadata, download imdb_video_2016-10.mat (6.7GB) with all the metadata and also the dataset stats. Put them in `cfnet/data/`.
5. Go to `cfnet/src/training` and remove the trailing `.example` from `env_paths_training.m.example` and `startup.m.example`, editing the files as appropriate.
6. The various `cfnet/train/run_experiment_*.m` are examples to start training from (a hedged sketch of such a script is given after this list). Default hyper-parameters are at the start of `experiment.m` and are overwritten by the custom ones specified in `run_experiment_*.m`.
7. By default, training plots are saved in `cfnet/src/training/data/`. When you are happy with the training, grab a network snapshot (`net-epoch-X.mat`) and save it somewhere (e.g. `cfnet/pretrained/`).
8. Go to point 4 of *Tracking only*, follow the instructions and enjoy the labour of your own GPUs!
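For reference, a `run_experiment_*.m`-style script usually only overrides a handful of the defaults defined in `experiment.m`. The sketch below is a guess at that pattern: the option names and the `experiment` call signature are assumptions based on the description above, so check `experiment.m` for the real ones.

```matlab
% Illustrative run_experiment_*-style script. The option names and the
% experiment() call pattern are assumptions -- the authoritative defaults
% and signature are at the top of experiment.m.
function run_experiment_small_gpu(imdb_video)
    if nargin < 1
        imdb_video = [];                        % let experiment.m load the metadata itself
    end
    opts.expDir = 'data/my_cfnet_experiment';   % hypothetical output folder
    opts.train.gpus = 1;                        % hypothetical: train on one GPU
    opts.train.numEpochs = 50;                  % hypothetical override of a default
    experiment(imdb_video, opts);               % defaults overridden by opts
end
```

Once training has produced a snapshot you like, copying its `net-epoch-X.mat` into `cfnet/pretrained/` (point 7 above) is all that is needed before switching to the tracking entry points.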