Sunnet is a lightweight deep learning library.
Comparison with TensorFlow on ResNet50 inference. Test PC: Intel Core i5-2400, GeForce GTX 1050, Windows 7, MSVC 12.
| | CPU: time/img, ms | GPU: time/img, ms | CPU: RAM, MB | GPU: RAM, MB |
|---|---|---|---|---|
| Sunnet | 195 | 15 | 600 | 800 |
| TensorFlow | 250 | 25 | 400 | 1400 |
## Features

- written from scratch in C++ (only STL + OpenBLAS for computation), with a C interface;
- runs on Windows and Linux;
- the network structure is defined in JSON (see the sketch after this list);
- base layers: fully connected, convolutional, pooling; additional layers: resize, crop, etc.;
- basic features: batchNorm, dropout; weight optimizers: Adam, AdaGrad, etc.;
- OpenBLAS is used for computation on the CPU, CUDA / cuDNN for the GPU;
- interfaces for C++, C# and Python.
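As a rough illustration of the JSON-based configuration, a network description can be sketched as a plain Python dict and serialized. The key names and overall schema below are assumptions for illustration only, not sunnet's documented format; see the wiki for the real schema.

```python
import json

# Hypothetical sketch of a JSON network description.
# The key names ("nodes", "operator", "nextNodes", ...) are illustrative
# assumptions, not sunnet's documented schema.
net_cfg = {
    "nodes": [
        {"name": "In", "operator": "Input",          "nextNodes": "C1"},
        {"name": "C1", "operator": "Convolution",    "params": {"kernelsCnt": 15}, "nextNodes": "F1"},
        {"name": "F1", "operator": "FullyConnected", "params": {"units": 10},      "nextNodes": "Output"},
    ]
}

print(json.dumps(net_cfg, indent=2))
```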
## Python example
```python
import datetime
import numpy as np
# (the import of the sunnet Python bindings that provide snNet,
#  snOperator and snType is elided here)

# create the net
net = snNet.Net()
net.addNode('In', snOperator.Input(), 'C1') \
   .addNode('C1', snOperator.Convolution(15), 'C2') \
   .addNode('C2', snOperator.Convolution(25), 'P1') \
   .addNode('P1', snOperator.Pooling(), 'F1') \
   .addNode('F1', snOperator.FullyConnected(256), 'F2') \
   .addNode('F2', snOperator.FullyConnected(10), 'LS') \
   .addNode('LS', snOperator.LossFunction(snType.lossType.softMaxToCrossEntropy), 'Output')

.............

# training loop
for n in range(1000):
    acc = [0]
    net.training(lr, inLayer, outLayer, targLayer, acc)

    # compute accuracy over the batch
    acc[0] = 0
    for i in range(bsz):
        if np.argmax(outLayer[i][0][0]) == np.argmax(targLayer[i][0][0]):
            acc[0] += 1

    accuratSumm += acc[0] / bsz
    print(datetime.datetime.now().strftime('%H:%M:%S'), n, "accurate", accuratSumm / (n + 1))
```
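The snippet above relies on variables created in the elided setup (`lr`, `bsz`, `inLayer`, `outLayer`, `targLayer`, `accuratSumm`). A minimal sketch of that setup, assuming 28x28 grayscale inputs, 10 classes and NumPy buffers; all shapes and values here are assumptions, not the library's requirements:

```python
import numpy as np

lr = 0.001         # learning rate (assumed value)
bsz = 100          # batch size (assumed value)
accuratSumm = 0.0  # running sum of per-batch accuracy

# input batch and one-hot target batch; outLayer is filled by net.training.
# The (batch, channels, height, width) layout is an assumption.
inLayer   = np.zeros((bsz, 1, 28, 28), dtype=np.float32)
outLayer  = np.zeros((bsz, 1, 1, 10),  dtype=np.float32)
targLayer = np.zeros((bsz, 1, 1, 10),  dtype=np.float32)
```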
## Wiki

## Examples
## License

Licensed under the [MIT-2.0] license.