
Implementation of L1, L2, ElasticNet, GroupLasso and GroupSparseRegularization

  1. Publication available here: [https://towardsdatascience.com/different-types-of-regularization-on-neuronal-network-with-pytorch-a9d6faf4793e](https://towardsdatascience.com/different-types-of-regularization-on-neuronal-network-with-pytorch-a9d6faf4793e)
  2. Implemented in PyTorch. This is an attempt to provide different types of regularization of neural network weights in PyTorch.
  3. The regularization can be applied to a single set of weights or to all the weights of the model.
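The general pattern is to add a penalty term to the task loss before calling `backward()`. Below is a minimal sketch of an ElasticNet penalty (L1 and L2 are the special cases `alpha=1` and `alpha=0`); the helper name `elastic_net_penalty` and the hyperparameter names `ld` / `alpha` are illustrative, not necessarily those used in this repository:

```python
import torch
import torch.nn as nn

def elastic_net_penalty(model, ld=1e-5, alpha=0.9):
    """Hypothetical helper: ld scales the penalty, alpha weights
    the L1 term and (1 - alpha) the L2 term."""
    l1 = sum(p.abs().sum() for p in model.parameters())
    l2 = sum(p.pow(2).sum() for p in model.parameters())
    return ld * (alpha * l1 + (1 - alpha) * l2)

model = nn.Linear(4, 2)
criterion = nn.CrossEntropyLoss()
x = torch.randn(8, 4)
y = torch.randint(0, 2, (8,))

# Penalized loss: task loss plus the regularization term.
loss = criterion(model(x), y) + elastic_net_penalty(model)
loss.backward()
```

With `alpha=1` this reduces to pure L1 (promoting sparsity), with `alpha=0` to pure L2.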

Metrics Scores table

Abbreviations: EL = ElasticNet, GL = GroupLasso, SGL = sparse group lasso (GroupSparseRegularization), FC = fully connected network without regularization.

| Regularization | Test Accuracy | Best Hyperparameters |
| --- | --- | --- |
| L1 | 98.3193 | 'batch_size': 32, 'ld_reg': 1e-05, 'lr': 0.0001, 'n_epoch': 200 |
| L2 | 99.1596 | 'batch_size': 32, 'ld_reg': 1e-06, 'lr': 0.0001, 'n_epoch': 200 |
| EL | 98.3193 | 'alpha_reg': 0.9, 'batch_size': 32, 'ld_reg': 1e-05, 'lr': 0.001, 'n_epoch': 200 |
| GL | 97.4789 | 'batch_size': 32, 'ld_reg': 1e-07, 'lr': 0.0001, 'n_epoch': 200 |
| SGL | 76.4705 | 'batch_size': 128, 'ld_reg': 1e-06, 'lr': 1e-05, 'n_epoch': 200 |
| FC | 90.7563 | 'batch_size': 128, 'lr': 0.01, 'n_epoch': 200 |
| FC with weight decay | 99.1596 | 'batch_size': 32, 'lr': 0.0001, 'n_epoch': 200, 'weight_decay': 0.01 |
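The group penalties (GL and SGL rows) regularize whole groups of weights together rather than individual entries. A minimal sketch, assuming each output row of a layer's weight matrix is one group (a common convention, not necessarily the grouping used in this repository); the helper names are illustrative:

```python
import torch
import torch.nn as nn

def group_lasso_penalty(weight, ld=1e-7):
    # Group lasso: sum of the L2 norms of each group.
    # Each row of the weight matrix is treated as one group here;
    # some formulations also scale each norm by sqrt(group size).
    return ld * weight.norm(p=2, dim=1).sum()

def sparse_group_lasso_penalty(weight, ld=1e-6, alpha=0.5):
    # Sparse group lasso: convex combination of a per-weight L1 term
    # (within-group sparsity) and the group lasso term (whole-group sparsity).
    l1 = weight.abs().sum()
    gl = weight.norm(p=2, dim=1).sum()
    return ld * (alpha * l1 + (1 - alpha) * gl)

layer = nn.Linear(16, 8)
gl_penalty = group_lasso_penalty(layer.weight)
sgl_penalty = sparse_group_lasso_penalty(layer.weight)
```

Because the group norm is not squared, the penalty can drive entire rows exactly to zero, which is what produces the high sparsity percentages in the table below. Note that the "FC with weight decay" row needs no explicit penalty term: L2 weight decay is passed directly to the optimizer, e.g. `torch.optim.Adam(model.parameters(), weight_decay=0.01)`.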

Sparsity Percentage table

| Model | Layer 1 (%) | Layer 2 (%) | Layer 3 (%) |
| --- | --- | --- | --- |
| L1 | 60 | 80 | 0 |
| L2 | 62.5 | 5 | 0 |
| EL | 85 | 80 | 30 |
| GL | 7.5 | 5 | 0 |
| SGL | 92.5 | 85 | 30 |
| FC | 0 | 0 | 0 |
| FC with weight decay | 0 | 0 | 0 |
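The percentages above can be measured as the share of weights in each layer whose magnitude is numerically zero. A minimal sketch, assuming a small tolerance to absorb floating-point noise (the helper name and tolerance are illustrative, not necessarily what this repository uses):

```python
import torch
import torch.nn as nn

def sparsity_percentage(weight, tol=1e-6):
    """Percentage of entries whose magnitude is at most tol."""
    return 100.0 * (weight.abs() <= tol).float().mean().item()

model = nn.Sequential(nn.Linear(8, 4), nn.ReLU(), nn.Linear(4, 2))
with torch.no_grad():
    model[0].weight[:2].zero_()  # zero half the first layer's rows, for illustration

for name, param in model.named_parameters():
    if name.endswith("weight"):
        print(name, f"{sparsity_percentage(param):.1f}%")
```

A layer with 0% sparsity (the FC baselines) has no weight pruned to zero; plain L2 weight decay shrinks weights but does not zero them out, which matches the last two rows.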