The PyTorch Implementation of L-Softmax

This repository contains a new, clean, and enhanced PyTorch implementation of L-Softmax, as proposed in the following paper:

Large-Margin Softmax Loss for Convolutional Neural Networks, by Weiyang Liu, Yandong Wen, Zhiding Yu, and Meng Yang [PDF on arXiv] [original Caffe code by the authors]

L-Softmax modifies the standard softmax loss by imposing an adjustable angular margin m between classes, which increases the inter-class separability and intra-class compactness of the learned features.
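For readers who want the gist before diving into the code, here is a minimal PyTorch sketch of that idea: the target-class logit ||W_y|| ||x|| cos(theta_y) is replaced by ||W_y|| ||x|| psi(theta_y), where psi(theta) = (-1)^k cos(m*theta) - 2k for theta in [k*pi/m, (k+1)*pi/m]. The helper name l_softmax_logits and its signature are illustrative assumptions, not this repository's actual API.

```python
import math
import torch

def l_softmax_logits(features, weight, labels, m=4):
    """Illustrative sketch of the L-Softmax logit adjustment (not this repo's API).

    features: (N, D) batch of features, weight: (D, C) classifier weights,
    labels: (N,) target class indices, m: integer angular margin.
    """
    logits = features @ weight                   # (N, C): ||W_j|| ||x|| cos(theta_j)
    idx = torch.arange(features.size(0))
    w_norm = weight.norm(p=2, dim=0)[labels]     # ||W_y|| for each sample
    x_norm = features.norm(p=2, dim=1)           # ||x|| for each sample
    cos_theta = (logits[idx, labels] / (w_norm * x_norm)).clamp(-1 + 1e-7, 1 - 1e-7)
    theta = torch.acos(cos_theta)
    k = torch.floor(m * theta / math.pi)         # which [k*pi/m, (k+1)*pi/m] interval
    psi = ((-1.0) ** k) * torch.cos(m * theta) - 2.0 * k
    logits[idx, labels] = w_norm * x_norm * psi  # margin-adjusted target logit
    return logits
```

The returned logits feed into the usual cross-entropy loss; with m = 1, psi(theta) reduces to cos(theta) and the standard softmax loss is recovered.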

This re-implementation is based on the earlier PyTorch implementation here by jihunchoi and borrows some ideas from the TensorFlow implementation here by auroua. The main improvements are described in the sections below.

Version compatibility

This code has been tested on Ubuntu 18.04 LTS using the PyCharm IDE and an NVIDIA GTX 1080 Ti GPU. The libraries and their tested versions are:

python = 3.6
pytorch = 0.4.1
torchvision = 0.2.1
matplotlib = 2.2.2
numpy = 1.14.3
scipy = 1.1.0
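To quickly check that a local environment matches these versions, a short snippet like the following can help (a convenience check, not part of the repository):

```python
# Print installed versions to compare against the list above.
import torch, torchvision, matplotlib, numpy, scipy

for name, module in [("pytorch", torch), ("torchvision", torchvision),
                     ("matplotlib", matplotlib), ("numpy", numpy), ("scipy", scipy)]:
    print(name, module.__version__)

# The results below were produced on a GPU, so CUDA should be available.
print("CUDA available:", torch.cuda.is_available())
```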

Network parameters

Results

Here are the test-set visualization results of training on MNIST with different margins:

[figure: test-set feature visualizations for m = 1, 2, 3, and 4]

And here are the tabulated results of training on MNIST with the network proposed in the paper:

margin    test accuracy (this code)    test accuracy (paper)
m = 1     99.37%                       99.60%
m = 2     99.60%                       99.68%
m = 3     99.56%                       99.69%
m = 4     99.61%                       99.69%