
MicroMLP is a micro artificial neural network multilayer perceptron (principally used on ESP32 and Pycom modules)


Very easy to integrate and very light, with only one file: `microMLP.py`

MicroMLP features:

- Modifiable multilayer structure and connections
- Integrated bias on neurons
- Plasticity of the connections
- Activation function configurable by layer
- Tunable `Eta`, `Alpha` and `Gain` parameters
- Management of a set of examples, with learning
- Q-learning functions to use reinforcement learning
- Save and load of the whole structure to/from a JSON file
- Several activation functions (Heaviside, Sigmoid, TanH, SoftPlus, ReLU, Gaussian)

Use deep learning for:

<p align="center"> <img src="hc2-deep-learning.png"> </p>

Using MicroMLP static functions:

| Name | Function |
|------|----------|
| `Create` | `mlp = MicroMLP.Create(neuronsByLayers, activationFuncName, layersAutoConnectFunction=None, useBiasValue=1.0)` |
| `LoadFromFile` | `mlp = MicroMLP.LoadFromFile(filename)` |

Quickly creating a neural network with MicroMLP:

```python
from microMLP import MicroMLP
mlp = MicroMLP.Create([3, 10, 2], "Sigmoid", MicroMLP.LayersFullConnect)
```

Using MicroMLP main class:

| Name | Function |
|------|----------|
| Constructor | `mlp = MicroMLP()` |
| `GetLayer` | `layer = mlp.GetLayer(layerIndex)` |
| `GetLayerIndex` | `idx = mlp.GetLayerIndex(layer)` |
| `RemoveLayer` | `mlp.RemoveLayer(layer)` |
| `GetInputLayer` | `inputLayer = mlp.GetInputLayer()` |
| `GetOutputLayer` | `outputLayer = mlp.GetOutputLayer()` |
| `Learn` | `ok = mlp.Learn(inputVectorNNValues, targetVectorNNValues)` |
| `Test` | `ok = mlp.Test(inputVectorNNValues, targetVectorNNValues)` |
| `Predict` | `outputVectorNNValues = mlp.Predict(inputVectorNNValues)` |
| `QLearningLearnForChosenAction` | `ok = mlp.QLearningLearnForChosenAction(stateVectorNNValues, rewardNNValue, pastStateVectorNNValues, chosenActionIndex, terminalState=True, discountFactorNNValue=None)` |
| `QLearningPredictBestActionIndex` | `bestActionIndex = mlp.QLearningPredictBestActionIndex(stateVectorNNValues)` |
| `SaveToFile` | `ok = mlp.SaveToFile(filename)` |
| `AddExample` | `ok = mlp.AddExample(inputVectorNNValues, targetVectorNNValues)` |
| `ClearExamples` | `mlp.ClearExamples()` |
| `LearnExamples` | `learnCount = mlp.LearnExamples(maxSeconds=30, maxCount=None, stopWhenLearned=True, printMAEAverage=True)` |

| Property | Example | Read/Write |
|----------|---------|------------|
| `Layers` | `mlp.Layers` | get |
| `LayersCount` | `mlp.LayersCount` | get |
| `IsNetworkComplete` | `mlp.IsNetworkComplete` | get |
| `MSE` | `mlp.MSE` | get |
| `MAE` | `mlp.MAE` | get |
| `MSEPercent` | `mlp.MSEPercent` | get |
| `MAEPercent` | `mlp.MAEPercent` | get |
| `ExamplesCount` | `mlp.ExamplesCount` | get |

Using MicroMLP to learn the XOR problem (with hyperbolic tangent):

```python
from microMLP import MicroMLP

mlp = MicroMLP.Create( neuronsByLayers           = [2, 2, 1],
                       activationFuncName        = MicroMLP.ACTFUNC_TANH,
                       layersAutoConnectFunction = MicroMLP.LayersFullConnect )

nnFalse  = MicroMLP.NNValue.FromBool(False)
nnTrue   = MicroMLP.NNValue.FromBool(True)

mlp.AddExample( [nnFalse, nnFalse], [nnFalse] )
mlp.AddExample( [nnFalse, nnTrue ], [nnTrue ] )
mlp.AddExample( [nnTrue , nnTrue ], [nnFalse] )
mlp.AddExample( [nnTrue , nnFalse], [nnTrue ] )

learnCount = mlp.LearnExamples()

print( "LEARNED :" )
print( "  - False xor False = %s" % mlp.Predict([nnFalse, nnFalse])[0].AsBool )
print( "  - False xor True  = %s" % mlp.Predict([nnFalse, nnTrue] )[0].AsBool )
print( "  - True  xor True  = %s" % mlp.Predict([nnTrue , nnTrue] )[0].AsBool )
print( "  - True  xor False = %s" % mlp.Predict([nnTrue , nnFalse])[0].AsBool )

if mlp.SaveToFile("mlp.json") :
    print( "MicroMLP structure saved!" )
```

| Variable | Description | Default |
|----------|-------------|---------|
| `mlp.Eta` | Weighting of the error correction | 0.30 |
| `mlp.Alpha` | Strength of connections plasticity | 0.75 |
| `mlp.Gain` | Network learning gain | 0.99 |
| `mlp.CorrectLearnedMAE` | Threshold of self-learning error | 0.02 |
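The `eta`/`alpha` pair passed to the weight-update functions suggests the classic backpropagation-with-momentum rule, where `Eta` scales the error correction and `Alpha` carries over part of the previous update (the "plasticity"). The sketch below is only an illustration of that standard formula, not MicroMLP's actual code:

```python
# Classic backpropagation weight update with momentum:
#   delta_w = eta * error_signal * input + alpha * previous_delta_w
# eta and alpha default to the MicroMLP defaults listed above.

def update_weight(weight, prev_delta, error_signal, input_value,
                  eta=0.30, alpha=0.75):
    """Return the new weight and the delta to remember for the next step."""
    delta = eta * error_signal * input_value + alpha * prev_delta
    return weight + delta, delta

# First update has no accumulated momentum:
w, d = update_weight(weight=0.5, prev_delta=0.0,
                     error_signal=0.2, input_value=1.0)
# A second identical update is larger, boosted by the momentum term:
w, d = update_weight(w, d, error_signal=0.2, input_value=1.0)
```

A higher `alpha` smooths learning across steps; a higher `eta` makes each individual correction stronger.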

| Activation function name | Const | Detail |
|--------------------------|-------|--------|
| `"Heaviside"` | `MicroMLP.ACTFUNC_HEAVISIDE` | Heaviside binary step |
| `"Sigmoid"` | `MicroMLP.ACTFUNC_SIGMOID` | Logistic (sigmoid or soft step) |
| `"TanH"` | `MicroMLP.ACTFUNC_TANH` | Hyperbolic tangent |
| `"SoftPlus"` | `MicroMLP.ACTFUNC_SOFTPLUS` | SoftPlus rectifier |
| `"ReLU"` | `MicroMLP.ACTFUNC_RELU` | Rectified linear unit |
| `"Gaussian"` | `MicroMLP.ACTFUNC_GAUSSIAN` | Gaussian function |
| Layers auto-connect function | Detail |
|------------------------------|--------|
| `MicroMLP.LayersFullConnect` | Network fully connected |
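"Fully connected" means every neuron of a layer is connected to every neuron of the next layer. A small sketch of the resulting connection count (an illustration only, not MicroMLP's internals):

```python
# Count the weighted connections of a fully connected network,
# given the neuron count of each layer (biases not included).

def count_full_connections(neurons_by_layers):
    # Pair each layer with its successor and multiply the sizes.
    pairs = zip(neurons_by_layers, neurons_by_layers[1:])
    return sum(src * dst for src, dst in pairs)

# For the [3, 10, 2] network created earlier: 3*10 + 10*2 = 50.
print(count_full_connections([3, 10, 2]))  # → 50
```

This count grows quickly with layer sizes, which matters on memory-constrained targets such as ESP32 modules.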

Using MicroMLP.Layer class:

| Name | Function |
|------|----------|
| Constructor | `layer = MicroMLP.Layer(parentMicroMLP, activationFuncName=None, neuronsCount=0)` |
| `GetLayerIndex` | `idx = layer.GetLayerIndex()` |
| `GetNeuron` | `neuron = layer.GetNeuron(neuronIndex)` |
| `GetNeuronIndex` | `idx = layer.GetNeuronIndex(neuron)` |
| `AddNeuron` | `layer.AddNeuron(neuron)` |
| `RemoveNeuron` | `layer.RemoveNeuron(neuron)` |
| `GetMeanSquareError` | `mse = layer.GetMeanSquareError()` |
| `GetMeanAbsoluteError` | `mae = layer.GetMeanAbsoluteError()` |
| `GetMeanSquareErrorAsPercent` | `mseP = layer.GetMeanSquareErrorAsPercent()` |
| `GetMeanAbsoluteErrorAsPercent` | `maeP = layer.GetMeanAbsoluteErrorAsPercent()` |
| `Remove` | `layer.Remove()` |

| Property | Example | Read/Write |
|----------|---------|------------|
| `ParentMicroMLP` | `layer.ParentMicroMLP` | get |
| `ActivationFuncName` | `layer.ActivationFuncName` | get |
| `Neurons` | `layer.Neurons` | get |
| `NeuronsCount` | `layer.NeuronsCount` | get |

Using MicroMLP.InputLayer(Layer) class:

| Name | Function |
|------|----------|
| Constructor | `inputLayer = MicroMLP.InputLayer(parentMicroMLP, neuronsCount=0)` |
| `SetInputVectorNNValues` | `ok = inputLayer.SetInputVectorNNValues(inputVectorNNValues)` |

Using MicroMLP.OutputLayer(Layer) class:

| Name | Function |
|------|----------|
| Constructor | `outputLayer = MicroMLP.OutputLayer(parentMicroMLP, activationFuncName, neuronsCount=0)` |
| `GetOutputVectorNNValues` | `outputVectorNNValues = outputLayer.GetOutputVectorNNValues()` |
| `ComputeTargetLayerError` | `ok = outputLayer.ComputeTargetLayerError(targetVectorNNValues)` |

Using MicroMLP.Neuron class:

| Name | Function |
|------|----------|
| Constructor | `neuron = MicroMLP.Neuron(parentLayer)` |
| `GetNeuronIndex` | `idx = neuron.GetNeuronIndex()` |
| `GetInputConnections` | `connections = neuron.GetInputConnections()` |
| `GetOutputConnections` | `connections = neuron.GetOutputConnections()` |
| `AddInputConnection` | `neuron.AddInputConnection(connection)` |
| `AddOutputConnection` | `neuron.AddOutputConnection(connection)` |
| `RemoveInputConnection` | `neuron.RemoveInputConnection(connection)` |
| `RemoveOutputConnection` | `neuron.RemoveOutputConnection(connection)` |
| `SetBias` | `neuron.SetBias(bias)` |
| `GetBias` | `neuron.GetBias()` |
| `SetOutputNNValue` | `neuron.SetOutputNNValue(nnvalue)` |
| `ComputeValue` | `neuron.ComputeValue()` |
| `ComputeError` | `neuron.ComputeError(targetNNValue=None)` |
| `Remove` | `neuron.Remove()` |

| Property | Example | Read/Write |
|----------|---------|------------|
| `ParentLayer` | `neuron.ParentLayer` | get |
| `ComputedOutput` | `neuron.ComputedOutput` | get |
| `ComputedDeltaError` | `neuron.ComputedDeltaError` | get |
| `ComputedSignalError` | `neuron.ComputedSignalError` | get |

Using MicroMLP.Connection class:

| Name | Function |
|------|----------|
| Constructor | `connection = MicroMLP.Connection(neuronSrc, neuronDst, weight=None)` |
| `UpdateWeight` | `connection.UpdateWeight(eta, alpha)` |
| `Remove` | `connection.Remove()` |

| Property | Example | Read/Write |
|----------|---------|------------|
| `NeuronSrc` | `connection.NeuronSrc` | get |
| `NeuronDst` | `connection.NeuronDst` | get |
| `Weight` | `connection.Weight` | get |

Using MicroMLP.Bias class:

| Name | Function |
|------|----------|
| Constructor | `bias = MicroMLP.Bias(neuronDst, value=1.0, weight=None)` |
| `UpdateWeight` | `bias.UpdateWeight(eta, alpha)` |
| `Remove` | `bias.Remove()` |

| Property | Example | Read/Write |
|----------|---------|------------|
| `NeuronDst` | `bias.NeuronDst` | get |
| `Value` | `bias.Value` | get |
| `Weight` | `bias.Weight` | get |

Using MicroMLP.NNValue static functions:

| Name | Function |
|------|----------|
| `FromPercent` | `nnvalue = MicroMLP.NNValue.FromPercent(value)` |
| `NewPercent` | `nnvalue = MicroMLP.NNValue.NewPercent()` |
| `FromByte` | `nnvalue = MicroMLP.NNValue.FromByte(value)` |
| `NewByte` | `nnvalue = MicroMLP.NNValue.NewByte()` |
| `FromBool` | `nnvalue = MicroMLP.NNValue.FromBool(value)` |
| `NewBool` | `nnvalue = MicroMLP.NNValue.NewBool()` |
| `FromAnalogSignal` | `nnvalue = MicroMLP.NNValue.FromAnalogSignal(value)` |
| `NewAnalogSignal` | `nnvalue = MicroMLP.NNValue.NewAnalogSignal()` |

Using MicroMLP.NNValue class:

| Name | Function |
|------|----------|
| Constructor | `nnvalue = MicroMLP.NNValue(minValue, maxValue, value)` |

| Property | Example | Read/Write |
|----------|---------|------------|
| `AsFloat` | `nnvalue.AsFloat = 639.513` | get / set |
| `AsInt` | `nnvalue.AsInt = 12345` | get / set |
| `AsPercent` | `nnvalue.AsPercent = 65` | get / set |
| `AsByte` | `nnvalue.AsByte = b'\x75'` | get / set |
| `AsBool` | `nnvalue.AsBool = True` | get / set |
| `AsAnalogSignal` | `nnvalue.AsAnalogSignal = 0.39472` | get / set |
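Conceptually, an `NNValue` holds one value in a `minValue`..`maxValue` range and exposes it through several representations (percent 0-100, byte 0-255, bool, analog signal 0.0-1.0). The sketch below illustrates that idea with plain min/max normalization; it is a hypothetical `RangedValue` class for explanation, not MicroMLP's implementation:

```python
# One stored value, several read-only views of it, each obtained by
# rescaling the same 0.0..1.0 normalized ratio.

class RangedValue:
    def __init__(self, min_value, max_value, value):
        self._min, self._max = min_value, max_value
        self._val = value

    @property
    def as_ratio(self):        # 0.0 .. 1.0 ("analog signal" view)
        return (self._val - self._min) / (self._max - self._min)

    @property
    def as_percent(self):      # 0 .. 100
        return self.as_ratio * 100.0

    @property
    def as_byte(self):         # 0 .. 255
        return round(self.as_ratio * 255)

    @property
    def as_bool(self):         # True above the midpoint of the range
        return self.as_ratio >= 0.5

v = RangedValue(0, 200, 150)
print(v.as_percent)  # 75.0
print(v.as_byte)     # 191
print(v.as_bool)     # True
```

Normalizing every input and output to one internal scale is what lets the same network mix booleans, bytes and analog signals freely.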

By JC`zic for HC² ;')

Keep it simple, stupid :+1: