Notes:
Still a work in progress. This is a heavily modified version of the KerasSharp repo, adapted for use in Unity with TensorflowSharp. A lot of functionality has been removed or added for Unity.
It is made for Aalto University's Computational Intelligence in Games course and this repo: https://github.com/tcmxx/UnityTensorflowKeras.
Tested Unity Version: 2018.1.6f1
Installation:
- Copy the whole repo into the Assets folder of your Unity project.
- Import the TensorflowSharp plugin assets, either:
- from the Unity ML-Agents repo: https://github.com/Unity-Technologies/ml-agents/blob/master/docs/Using-TensorFlow-Sharp-in-Unity.md
- or the one provided by me: https://1drv.ms/f/s!AoCjn1GKLxkNtNYuC2XaKoXX1aJbxQ
Note that the TensorflowSharp plugins provided by Unity are not up to date: their Android build is not supported, there are bugs in the Conv2D gradient, and the gradient for the Concat operation is not supported. The plugins I provide fix those problems for the Windows platform, but not for others (I don't have Mac or Linux machines).
The package I provide also has a GPU-enabled option. It requires a Windows machine with CUDA (v9.0 recommended) and cuDNN (v7.0 recommended) installed.
Platforms:
Windows is almost fully supported. If you want to use the GPU, CUDA and cuDNN are needed (see above). Mac should be fully supported once I have a Mac to build on, but for now it lacks the Concat gradient. Mac does not support GPU. Linux is not tested at all.
Android does not support any type of gradient/training. iOS is not tested at all.
Instructions:
Not available yet.
Below is the readme from the original KerasSharp Repo:
Keras Sharp
An ongoing effort to port most of the Keras deep learning library to C#.
Welcome to the Keras# project! We aim to bring an experience-compatible Keras-like API to C#, meaning that, if you already know Keras, you should not have to learn any new concepts to get up and running with Keras#. This is a direct, line-by-line port of the Keras project, so updates and fixes sent to the main Keras project should be simple and straightforward to apply to this branch. As in the original project, we aim to support both TensorFlow and CNTK - but not Theano, as it was discontinued in 2017.
Example
Consider the following Keras Python example, originally written by Jason Brownlee, reproduced below:
from keras.models import Sequential
from keras.layers import Dense
import numpy
# fix random seed for reproducibility
numpy.random.seed(7)
# load pima indians dataset
dataset = numpy.loadtxt("pima-indians-diabetes.csv", delimiter=",")
# split into input (X) and output (Y) variables
X = dataset[:,0:8]
Y = dataset[:,8]
# create model
model = Sequential()
model.add(Dense(12, input_dim=8, activation='relu'))
model.add(Dense(8, activation='relu'))
model.add(Dense(1, activation='sigmoid'))
# Compile model
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
# Fit the model
model.fit(X, Y, epochs=150, batch_size=10)
# evaluate the model
scores = model.evaluate(X, Y)
print("\n%s: %.2f%%" % (model.metrics_names[1], scores[1]*100))
The same can be achieved with Keras# using:
// Load the Pima Indians Data Set
var pima = new Accord.DataSets.PimaIndiansDiabetes();
float[,] x = pima.Instances.ToMatrix().ToSingle();
float[] y = pima.ClassLabels.ToSingle();
// Create the model
var model = new Sequential();
model.Add(new Dense(12, input_dim: 8, activation: new ReLU()));
model.Add(new Dense(8, activation: new ReLU()));
model.Add(new Dense(1, activation: new Sigmoid()));
// Compile the model (for the moment, only the mean square
// error loss is supported, but this should be solved soon)
model.Compile(loss: new MeanSquareError(),
              optimizer: new Adam(),
              metrics: new[] { new Accuracy() });
// Fit the model for 150 epochs
model.fit(x, y, epochs: 150, batch_size: 10);
// Use the model to make predictions
float[] pred = model.predict(x)[0].To<float[]>();
// Evaluate the model
double[] scores = model.evaluate(x, y);
Console.WriteLine($"{model.metrics_names[1]}: {scores[1] * 100}");
Upon execution, you should see the same familiar Keras behavior.
This is possible because Keras# is a direct, line-by-line port of the Keras project into C#. A goal of this project is to ensure that porting existing code from its Python counterpart into C# can be done with minimal effort, if any.
Backends
Keras# currently supports TensorFlow and CNTK backends. If you would like to switch between different backends:
KerasSharp.Backends.Current.Switch("KerasSharp.Backends.TensorFlowBackend");
or,
KerasSharp.Backends.Current.Switch("KerasSharp.Backends.CNTKBackend");
or,
If you would like to implement your own backend for your preferred library, such as DiffSharp, just provide your own implementation of the IBackend interface and specify it using:
KerasSharp.Backends.Current.Switch("YourNamespace.YourOwnBackend");
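For example, the backend can be selected once at application startup based on a configuration flag. The sketch below uses nothing beyond the Current.Switch call shown above; the BackendSelector class and useCntk parameter are hypothetical names for illustration, not part of the library:
// Minimal sketch: choose a backend by its fully-qualified type name at startup.
// BackendSelector and useCntk are illustrative names; only Current.Switch comes from Keras#.
public static class BackendSelector
{
    public static void SelectBackend(bool useCntk)
    {
        // The argument is the fully-qualified name of a class implementing IBackend.
        string backendName = useCntk
            ? "KerasSharp.Backends.CNTKBackend"
            : "KerasSharp.Backends.TensorFlowBackend";

        KerasSharp.Backends.Current.Switch(backendName);
    }
}
The idea in this sketch is to switch before constructing any models, so that all layers created afterwards use the selected backend.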
Work-in-progress
However, please note that this is still a work in progress - not only Keras#, but also TensorFlowSharp and CNTK. If you would like to contribute to the development of this project, please consider submitting new issues to any of those projects, including ours.
Contributing in development
If you would like to contribute to the project, please see: How to contribute to Keras#.
License & Copyright
The Keras-Sharp project is brought to you under the as-permissive-as-possible MIT license. This is the same license used by the original Keras project. This project also keeps track of all code contributions through the project's issue tracker, and pledges to update all licensing information once user contributions are accepted. Contributors are asked to grant explicit copyright licenses for their contributions, which guarantees this project can be used in production without any licensing-related worries.
This project is brought to you by the same creators of the Accord.NET Framework.