**This repo is no longer maintained!**

# DICTOL - A Discriminative Dictionary Learning Toolbox for Classification (MATLAB version)

This toolbox is part of our LRSDL project.

Related publications:

1. Tiep H. Vu, Vishal Monga. "Fast Low-Rank Shared Dictionary Learning for Image Classification." To appear in IEEE Transactions on Image Processing. [paper]
2. Tiep H. Vu, Vishal Monga. "Learning a low-rank shared dictionary for object classification." International Conference on Image Processing (ICIP) 2016. [paper]

Author: Tiep Vu

Run `DICTOL_demo.m` to see examples.

If you experience any bugs, please let me know via the Issues tab. I would really appreciate it and will fix the error as soon as possible. Thank you.
On this page:

<!-- MarkdownTOC -->
- [Notation](#notation)
- [Sparse Representation-based classification (SRC)](#sparse-representation-based-classification-src)
- [Online Dictionary Learning (ODL)](#online-dictionary-learning-odl)
- [LCKSVD](#lcksvd)
- [Dictionary learning with structured incoherence and shared features (DLSI)](#dictionary-learning-with-structured-incoherence-and-shared-features-dlsi)
- [Dictionary learning for separating the particularity and the commonality (COPAR)](#dictionary-learning-for-separating-the-particularity-and-the-commonality-copar)
- [LRSDL](#lrsdl)
- [Fisher discrimination dictionary learning (FDDL)](#fisher-discrimination-dictionary-learning-fddl)
- [Discriminative Feature-Oriented dictionary learning (DFDL)](#discriminative-feature-oriented-dictionary-learning-dfdl)
- [D2L2R2](#dlr)
- [Fast iterative shrinkage-thresholding algorithm (FISTA)](#fast-iterative-shrinkage-thresholding-algorithm-fista)
- [References](#references)
<!-- /MarkdownTOC -->
<a name="notation"></a>
## Notation

- `Y`: signals. Each column is one observation.
- `D`: dictionary.
- `X`: sparse coefficient.
- `d`: signal dimension, `d = size(Y, 1)`.
- `C`: number of classes.
- `c`: class index.
- `n_c`: number of training samples in class `c`. Typically, all `n_c` are the same and equal to `n`.
- `N`: total number of training samples.
- `Y_range`: an array storing the range of each class, assuming the labels are sorted in ascending order. Example: if `Y_range = [0, 10, 25]`, then:
    - There are two classes: samples from class 1 range from 1 to 10, samples from class 2 range from 11 to 25.
    - In general, samples from class `c` range from `Y_range(c) + 1` to `Y_range(c+1)`.
    - The number of classes is `C = numel(Y_range) - 1`.
- `k_c`: number of bases in the class-specific dictionary of class `c`. Typically, all `k_c` are the same and equal to `k`.
- `k_0`: number of bases in the shared dictionary.
- `K`: total number of dictionary bases.
- `D_range`: similar to `Y_range`, but used for the dictionary and excluding the shared dictionary.
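As a quick illustration of this convention, the snippet below (a toy sketch with made-up data, not part of the toolbox) slices out the samples of one class using `Y_range`:

```matlab
% Toy example of the Y_range convention (illustrative, not toolbox code).
Y = rand(3, 25);            % d = 3, N = 25; each column is one observation
Y_range = [0, 10, 25];      % class 1: columns 1..10, class 2: columns 11..25

C  = numel(Y_range) - 1;    % number of classes, here 2
c  = 2;                     % class index
Yc = Y(:, Y_range(c) + 1 : Y_range(c+1));   % samples of class c, a 3-by-15 block
```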
<a name="sparse-representation-based-classification-src"></a>
## Sparse Representation-based classification (SRC)

- A sparse representation-based classification implementation [1].
- Classification based on SRC.
- Syntax: `[pred, X] = SRC_pred(Y, D, D_range, opts)`
    - INPUT:
        - `Y`: test samples.
        - `D`: the total dictionary, `D = [D_1, D_2, ..., D_C]`, with `D_c` being the c-th class-specific dictionary.
        - `D_range`: range of class-specific dictionaries in `D`. See also [Notation](#notation).
        - `opts`: options.
            - `opts.lambda`: `lambda` for the Lasso problem. Default: `0.01`.
            - `opts.max_iter`: maximum number of FISTA iterations. Default: `100`. See also the [FISTA](#fast-iterative-shrinkage-thresholding-algorithm-fista) section.
    - OUTPUT:
        - `pred`: predicted labels of the test samples.
        - `X`: solution of the Lasso problem.
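As a usage sketch (assuming `D` and `D_range` come from a previous training step; `Y_test` and `label_test` are illustrative variable names, not toolbox identifiers):

```matlab
% Sketch: classify test samples with SRC_pred (DICTOL must be on the path).
opts.lambda   = 0.01;   % Lasso regularization (the default)
opts.max_iter = 100;    % FISTA iterations (the default)
[pred, X] = SRC_pred(Y_test, D, D_range, opts);
acc = mean(pred(:) == label_test(:));   % fraction of correctly predicted labels
```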
<a name="online-dictionary-learning-odl"></a>
## Online Dictionary Learning (ODL)

- An implementation of the well-known Online Dictionary Learning method [2].

<a name="cost-function"></a>
### Cost function

<img src = "latex/ODL_cost.png" height = "40"/>

<a name="training-odl"></a>
### Training ODL

- Syntax: `[D, X] = ODL(Y, k, lambda, opts, sc_method)`
    - INPUT:
        - `Y`: training samples.
        - `k`: number of bases in the trained dictionary.
        - `lambda`: the regularization parameter in the cost function.
        - `opts`: options.
        - `sc_method`: the sparse coding method used in the coefficient update step.
    - OUTPUT:
        - `D, X`: as in the problem formulation.
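A minimal training sketch, assuming `Y` already holds the training samples and that `'fista'` is an accepted `sc_method` (parameter values below are illustrative, not recommended defaults):

```matlab
% Sketch: learn a k-atom dictionary from Y (d-by-N) with ODL.
k      = 64;            % number of dictionary bases
lambda = 0.1;           % l1 regularization (illustrative value)
opts.max_iter = 100;
[D, X] = ODL(Y, k, lambda, opts, 'fista');
% D is d-by-k; X is the k-by-N sparse coefficient matrix, so Y is
% approximated by D*X.
```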
<a name="lcksvd"></a>
## LCKSVD

Check its project page.

<a name="dictionary-learning-with-structured-incoherence-and-shared-features-dlsi"></a>
## Dictionary learning with structured incoherence and shared features (DLSI)

- An implementation of the well-known DLSI method [5].

<a name="cost-function-1"></a>
### Cost function

<img src = "latex/DLSI_cost.png" height = "50"/>

<a name="training-dlsi"></a>
### Training DLSI

- Function: `[D, X, rt] = DLSI(Y, Y_range, opts)` (the main DLSI algorithm).
    - INPUT:
        - `Y, Y_range`: training samples and their labels.
        - `opts`: options.
            - `opts.lambda, opts.eta`: `lambda` and `eta` in the cost function.
            - `opts.max_iter`: maximum number of iterations.
    - OUTPUT:
        - `D`: the trained dictionary.
        - `X`: the trained sparse coefficients.
        - `rt`: total running time of the training process.
<a name="dlsi-predict-new-samples"></a>
### DLSI predict new samples

- Function: `pred = DLSI_pred(Y, D, opts)` predicts the labels of new input `Y`, given the trained dictionary `D` and the parameters stored in `opts`.

<a name="demo"></a>
### Demo

Run `DLSI_top` in the MATLAB command window.
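The whole train-then-predict pipeline might look like the following sketch (parameter values and the `Y_train`/`Y_test` variable names are illustrative):

```matlab
% Sketch: DLSI training followed by prediction on held-out samples.
opts.lambda   = 0.01;   % sparsity weight in the cost function
opts.eta      = 0.1;    % incoherence weight in the cost function
opts.max_iter = 100;
[D, X, rt] = DLSI(Y_train, Y_range, opts);
pred = DLSI_pred(Y_test, D, opts);
```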
<a name="dictionary-learning-for-separating-the-particularity-and-the-commonality-copar"></a>
## Dictionary learning for separating the particularity and the commonality (COPAR)

- An implementation of COPAR [7].

<a name="cost-function-2"></a>
### Cost function

<img src = "latex/COPAR_cost.png" height = "50"/>

where:

<img src = "latex/COPAR_cost1.png" height = "50"/>

<a name="training-copar"></a>
### Training COPAR

- Function: `[D, X, rt] = COPAR(Y, Y_range, opts)`
    - INPUT:
        - `Y, Y_range`: training samples and their labels.
        - `opts`: a struct.
            - `opts.lambda, opts.eta`: `lambda` and `eta` in the cost function.
            - `opts.max_iter`: maximum number of iterations.
    - OUTPUT:
        - `D`: the trained dictionary.
        - `X`: the trained sparse coefficients.
        - `rt`: total running time of the training process.
<a name="copar-predect-new-samples"></a>
### COPAR predict new samples

- Function: `pred = COPAR_pred(Y, D, D_range_ext, opts)` predicts the labels of the input `Y`.
    - INPUT:
        - `Y`: test samples.
        - `D`: the trained dictionary.
        - `D_range_ext`: range of the class-specific and shared dictionaries in `D`. The shared dictionary is located at the end of `D`.
        - `opts`: a struct of options:
            - `opts.classify_mode`: a string specifying the classification mode, either `'GC'` (global coding) or `'LC'` (local coding).
            - `opts.lambda, opts.eta, opts.max_iter`: as in `COPAR.m`.
    - OUTPUT:
        - `pred`: predicted labels of `Y`.
<a name="demo-1"></a>
### Demo

Run `COPAR_top` in the MATLAB command window.
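A combined sketch of COPAR training and prediction (parameter values are illustrative, and `D_range_ext` is assumed to be already built so that the shared block sits at the end of `D`):

```matlab
% Sketch: COPAR training and global-coding prediction (illustrative values).
opts.lambda   = 0.01;
opts.eta      = 0.1;
opts.max_iter = 100;
[D, X, rt] = COPAR(Y_train, Y_range, opts);
opts.classify_mode = 'GC';   % global coding; use 'LC' for local coding
pred = COPAR_pred(Y_test, D, D_range_ext, opts);
```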
<a name="lrsdl"></a>
## LRSDL

- An implementation of LRSDL [8].

<a name="motivation"></a>
### Motivation

<a name="cost-function-3"></a>
### Cost function

Note that unlike COPAR, LRSDL separates the class-specific dictionaries (`D`) from the shared dictionary (`D_0`). The sparse coefficients (`X`, `X^0`) are separated accordingly.
<a name="training-lrsdl"></a>
### Training LRSDL

- Function: `[D, D0, X, X0, CoefM, coefM0, opts, rt] = LRSDL(Y, train_label, opts)`
    - INPUT:
        - `Y, train_label`: training samples and their labels.
        - `opts`: a struct.
            - `opts.lambda1, opts.lambda2`: `lambda1` and `lambda2` in the cost function.
            - `opts.lambda3`: `eta` in the cost function (to be renamed later).
            - `opts.max_iter`: maximum number of iterations.
            - `opts.D_range`: range of the trained dictionary.
            - `opts.k0`: size of the shared dictionary.
    - OUTPUT:
        - `D, D0, X, X0`: trained matrices, as in the cost function.
        - `CoefM`: the mean matrix; `CoefM(:, c)` is the mean vector of `X_c` (mean of its columns).
        - `CoefM0`: the mean vector of `X0`.
        - `rt`: total running time (in seconds).
<a name="lrsdl-predict-new-samples"></a>
### LRSDL predict new samples

See the `LRSDL_pred_GC.m` function.

<a name="demo-2"></a>
### Demo

Run `LRSDL_top` in the MATLAB command window.
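A training sketch for LRSDL (all parameter values are illustrative; `C`, `Y_train`, and `train_label` are assumed to be defined beforehand):

```matlab
% Sketch: train LRSDL with a 10-atom shared dictionary (illustrative values).
opts.lambda1  = 0.01;
opts.lambda2  = 0.01;
opts.lambda3  = 0.1;            % plays the role of eta in the cost function
opts.max_iter = 100;
opts.k0       = 10;             % size of the shared dictionary
opts.D_range  = 0:10:10*C;      % C class-specific blocks of 10 atoms each
[D, D0, X, X0, CoefM, coefM0, opts, rt] = LRSDL(Y_train, train_label, opts);
```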
<a name="fisher-discrimination-dictionary-learning-fddl"></a>
## Fisher discrimination dictionary learning (FDDL)

- An implementation of FDDL [4].

<a name="cost-function-4"></a>
### Cost function

Similar to the LRSDL cost function, but without the red terms.

<a name="training-fddl"></a>
### Training FDDL

Set `opts.k0 = 0` and use the `LRSDL.m` function.
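In other words, FDDL is obtained as the special case of LRSDL with no shared dictionary. A hedged sketch (parameter values are illustrative):

```matlab
% Sketch: FDDL as LRSDL with an empty shared dictionary.
opts.k0       = 0;      % no shared dictionary: LRSDL reduces to FDDL
opts.lambda1  = 0.01;
opts.lambda2  = 0.01;
opts.max_iter = 100;
[D, ~, X, ~, CoefM] = LRSDL(Y_train, train_label, opts);
pred = FDDL_pred(Y_test, D, CoefM, opts);
```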
<a name="fddl-predect-new-samples"></a>
### FDDL predict new samples

- Function: `pred = FDDL_pred(Y, D, CoefM, opts)`
<a name="discriminative-feature-oriented-dictionary-learning-dfdl"></a>
## Discriminative Feature-Oriented dictionary learning (DFDL)

- See its project page [6].
<a name="dlr"></a>
## D2L2R2

- To be updated later.
<a name="fast-iterative-shrinkage-thresholding-algorithm-fista"></a>
## Fast iterative shrinkage-thresholding algorithm (FISTA)

- An implementation of FISTA [10].
- Check this repository.
<a name="references"></a>
## References

<a name="fn_src">[1]</a>. (SRC) Wright, John, et al. "Robust face recognition via sparse representation." Pattern Analysis and Machine Intelligence, IEEE Transactions on 31.2 (2009): 210-227. [paper]

<a name="fn_odl">[2]</a>. (ODL) Mairal, Julien, et al. "Online learning for matrix factorization and sparse coding." The Journal of Machine Learning Research 11 (2010): 19-60. [paper]

<a name="fn_lck">[3]</a>. (LC-KSVD) Jiang, Zhuolin, Zhe Lin, and Larry S. Davis. "Label consistent K-SVD: Learning a discriminative dictionary for recognition." Pattern Analysis and Machine Intelligence, IEEE Transactions on 35.11 (2013): 2651-2664. [Project page]

<a name="fn_fdd">[4]</a>. (FDDL) Yang, Meng, et al. "Fisher discrimination dictionary learning for sparse representation." Computer Vision (ICCV), 2011 IEEE International Conference on. IEEE, 2011. [paper], [code]

<a name="fn_dls">[5]</a>. (DLSI) Ramirez, Ignacio, Pablo Sprechmann, and Guillermo Sapiro. "Classification and clustering via dictionary learning with structured incoherence and shared features." Computer Vision and Pattern Recognition (CVPR), 2010 IEEE Conference on. IEEE, 2010. [paper]

<a name="fn_dfd">[6]</a>. (DFDL) Tiep H. Vu, H. S. Mousavi, V. Monga, A. U. Rao, and G. Rao. "Histopathological Image Classification using Discriminative Feature-Oriented Dictionary Learning." IEEE Transactions on Medical Imaging, volume 35, issue 3, pages 738-751, March 2016. [paper] [Project page]

<a name="fn_cor">[7]</a>. (COPAR) Kong, Shu, and Donghui Wang. "A dictionary learning approach for classification: separating the particularity and the commonality." Computer Vision ECCV 2012. Springer Berlin Heidelberg, 2012. 186-199. [paper]

<a name="fn_lrs">[8]</a>. (LRSDL) Tiep H. Vu, Vishal Monga. "Learning a low-rank shared dictionary for object classification." International Conference on Image Processing (ICIP) 2016. [paper]

<a name="fn_shr">[9]</a>. Cai, Jian-Feng, Emmanuel J. Candès, and Zuowei Shen. "A singular value thresholding algorithm for matrix completion." SIAM Journal on Optimization 20.4 (2010): 1956-1982. [paper]

<a name="fn_fista">[10]</a>. (FISTA) Beck, Amir, and Marc Teboulle. "A fast iterative shrinkage-thresholding algorithm for linear inverse problems." SIAM Journal on Imaging Sciences 2.1 (2009): 183-202. [paper]

<a name="fn_spams">[11]</a>. (SPAMS) The Sparse Modeling Software.