
SENet-Tensorflow

Simple TensorFlow implementation of Squeeze-and-Excitation Networks (SENet) using CIFAR-10

I implemented the following SENet variants.

If you want to see the original author's code, please refer to this link.

Requirements

Issue

Image_size

input_x = tf.pad(input_x, [[0, 0], [32, 32], [32, 32], [0, 0]]) # size 32x32 -> 96x96
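The padding arithmetic can be checked framework-free; a minimal NumPy illustration of the same scheme (32 zero pixels on each side of height and width, batch and channel dimensions untouched):

```python
import numpy as np

# Same padding spec as tf.pad above: [(before, after), ...] per dimension.
# 32 + 32 + 32 = 96, so a 32x32 image becomes 96x96.
batch = np.zeros((1, 32, 32, 3))
padded = np.pad(batch, [(0, 0), (32, 32), (32, 32), (0, 0)])
print(padded.shape)  # (1, 96, 96, 3)
```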

Not enough GPU memory

with tf.Session() as sess :  # NO
with tf.Session(config=tf.ConfigProto(allow_soft_placement=True)) as sess :  # OK
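If memory is still tight, TensorFlow 1.x can also be told to allocate GPU memory on demand instead of reserving it all at once. A sketch of such a session config (assuming TF 1.x):

```python
import tensorflow as tf

# allow_soft_placement lets TF fall back to CPU for ops that cannot
# be placed on the GPU; allow_growth makes the GPU memory allocation
# grow as needed rather than grabbing the whole device up front.
config = tf.ConfigProto(allow_soft_placement=True)
config.gpu_options.allow_growth = True

with tf.Session(config=config) as sess:
    ...  # training loop
```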

Idea

What is the "SE block" ?

[Figure: SE block diagram]

def Squeeze_excitation_layer(self, input_x, out_dim, ratio, layer_name):
    with tf.name_scope(layer_name):
        # Squeeze: global average pooling collapses each channel to one value
        squeeze = Global_Average_Pooling(input_x)

        # Excitation: bottleneck FC -> ReLU -> FC -> sigmoid gives
        # per-channel weights in (0, 1); integer division keeps units whole
        excitation = Fully_connected(squeeze, units=out_dim // ratio, layer_name=layer_name+'_fully_connected1')
        excitation = Relu(excitation)
        excitation = Fully_connected(excitation, units=out_dim, layer_name=layer_name+'_fully_connected2')
        excitation = Sigmoid(excitation)

        # Reshape to (N, 1, 1, C) so the weights broadcast over H and W
        excitation = tf.reshape(excitation, [-1, 1, 1, out_dim])

        # Scale: reweight the input feature map channel-wise
        scale = input_x * excitation

        return scale
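The same squeeze / excitation / scale computation can be written as a small framework-free NumPy sketch, which makes the shapes explicit (the weight matrices `w1`, `w2` and the helper `se_block` are hypothetical, standing in for the two fully connected layers above):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def se_block(x, w1, w2):
    """NumPy sketch of the SE block.

    x  : feature map, shape (N, H, W, C)
    w1 : first FC layer weights, shape (C, C // ratio)
    w2 : second FC layer weights, shape (C // ratio, C)
    """
    # Squeeze: global average pooling over H and W -> (N, C)
    squeeze = x.mean(axis=(1, 2))
    # Excitation: bottleneck FC -> ReLU -> FC -> sigmoid -> (N, C) in (0, 1)
    excitation = np.maximum(squeeze @ w1, 0.0)
    excitation = sigmoid(excitation @ w2)
    # Scale: broadcast the channel weights back over H and W
    return x * excitation[:, None, None, :]

# Example with C=8, ratio=2 (so the bottleneck has 4 units)
rng = np.random.default_rng(0)
x = rng.standard_normal((2, 4, 4, 8))
out = se_block(x, rng.standard_normal((8, 4)), rng.standard_normal((4, 8)))
print(out.shape)  # (2, 4, 4, 8)
```

Because the sigmoid gate lies in (0, 1), the block can only attenuate channels, never amplify them.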

How is it applied? (Inception, Residual)

<div align="center">  <img src="https://github.com/hujie-frank/SENet/blob/master/figures/SE-Inception-module.jpg" width="420"> <img src="https://github.com/hujie-frank/SENet/blob/master/figures/SE-ResNet-module.jpg" width="420"> </div>

How "Reduction ratio" should I set?

[Figure: results for different reduction ratios]
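The trade-off behind the ratio is easy to quantify: per block, the two FC layers add roughly 2·C²/r extra weights, so a smaller ratio buys capacity at a parameter cost. A hedged sketch (the helper name `se_extra_params` is hypothetical, biases ignored):

```python
# Extra parameters an SE block adds for C channels and reduction ratio r:
# two FC layers of shape (C, C//r) and (C//r, C), i.e. about 2*C*C/r weights.
def se_extra_params(channels, ratio):
    reduced = channels // ratio
    return channels * reduced + reduced * channels

# Halving the ratio doubles the overhead of the block.
for r in (4, 8, 16, 32):
    print(r, se_extra_params(256, r))
```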

ImageNet Results

Benefits against Network Depth

[Figure: ImageNet results by network depth]

Incorporation with Modern Architecture

[Figure: SE blocks incorporated into modern architectures]

Comparison with State-of-the-art

[Figure: comparison with state-of-the-art models]

Cifar10 Results

Coming soon.

Related works

Reference

Author

Junho Kim