<p align="center"> <img src="https://raw.githubusercontent.com/denosaurs/netsaur/main/assets/netsaur.svg" width="80rem" /> <br/> <h1 align="center">Netsaur</h1> </p> <br/> <p align="center"> <a href="https://github.com/denosaurs/netsaur/stargazers"> <img alt="netsaur stars" src="https://img.shields.io/github/stars/denosaurs/netsaur?logo=data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABwAAAAcCAYAAAEFCu8CAAAABGdBTUEAALGPC/xhBQAAADhlWElmTU0AKgAAAAgAAYdpAAQAAAABAAAAGgAAAAAAAqACAAQAAAABAAAAHKADAAQAAAABAAAAHAAAAABHddaYAAABxElEQVRIDe2Wv04CQRDGAQuoTKQ2ITyADZWVJZWV+gJYWBNqKh/C16CRBlprWxsTE2NJfABNOH9z7Gzm2Nv7A8TCOMnHzs1838ze3e4ejUbMkiRZS64lP1x8MjTFr2DQE6Gl2nI+7POARXAmdbas44ku8eLGhU9UckRliX6qxM9sQvz0vrcVaaKJKdsSNO7LOtK1kvcbaXVRu4LMz9kgKoYwBq/KLBi/yC2DQgSnBaLMQ88Tx7Q3AVkDKHpgBdoak5HrCSjuaAW/6zOz+u/Q3ZfcVrhliuaPYCAqsSJekIO/TlWbn2BveAH5JZBVUWayusZW2ClTuPzMi6xTIp5abuBHxHLcZSyzkxHF1uNJRrV9gXBhOl7h6wFW/FqcaGILEmsDWfg9G//3858Az0lWaHhm5dP3i9JoDtTm+1UrUdMl72OZv10itfx3zOYpLAv/FPQNLvFj35Bnco/gzeCD72H6b4JYaDTpgidwaJOa3bCji5BsgYcDdJUamSMi2lQTCEbgu0Zz4Y5UX3tE3K/RTKny3qNWdst3UWU8sYtmU40py2Go9o5zC460l/guJjm1leZrjaiH4B4cVxUK12mGVTV/j/cDqcFClUX01ZEAAAAASUVORK5CYII=" /> </a> <a href="https://github.com/denosaurs/netsaur/releases/latest/"> <img alt="netsaur releases" src="https://img.shields.io/github/v/release/denosaurs/netsaur?logo=github" /> </a> <a href="https://github.com/denosaurs/netsaur/blob/master/LICENSE"> <img alt="netsaur License" src="https://img.shields.io/github/license/denosaurs/netsaur?logo=data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABwAAAAcCAYAAAEFCu8CAAAABGdBTUEAALGPC/xhBQAAADhlWElmTU0AKgAAAAgAAYdpAAQAAAABAAAAGgAAAAAAAqACAAQAAAABAAAAHKADAAQAAAABAAAAHAAAAABHddaYAAAC5UlEQVRIDd2WPWtVQRCGby5pVASLiGghQSxyG8Ui2KWwCfkH9olY2JneQkiR0oCIxH/gB+qVFDYBIWBAbAIRSbCRpLXwIxLiPT7vnNm9e87ZxJtUwYH3zO47Mzv7Mbv3tlo5KYriGtgAJ81OY1ENdG/YI4boFEOI911BXgY/pdtwGuAtXpvmB1tAXHDnUolE5urkPOQo6MqA3pXWmJJL4Bb4rQ7yEYfxsjnIF29NJIoNC6e5fxOL/qN+9KCz7AaLpN8zI415N2i2EptpGrkRIjGeAuvR6IY1hSFLFUOug9Ms2M7ZxIUNytm1mnME186sdI2BOCwAyQMg54ugzSmKmwbPwSbolKH+hbAtQdsOoF+BsF3anUVwBdiOWRidFZDKTTrKEAJTm3GVrGkHzw/uPZbyx7DNNLfB7KGmRsCcr+/gjaiPSpAOTyX9qG4L/XBDdWXDDf1M+wtQ5fwCOtcb4Dto6VpLmzByB6gqdHbTItGSJdAGqibJQhmRfCF7IN4beSF2G9CqnGXQrxofXU+EykllNeoczRgYytDKMubDIRK0g5MF8rE69cGu0u9nlUcqaUZ41W0qK2nGcSzr4D2wV9U9wxp1rnpxn8agXAOHMQ9cy9kbHM7ngY4gFb03TxrO/yfBUifTtXt78jCrjY/jgEFnMn45LuNWUtknuu7NSm7D3QEn3HbatV1Q2jvgIRf1sfODKQaeymxZoMLlTqsq1LF+HvaTqQOzEzUCfni0/eNIA+DfuE3KEtbsegckGmMktTXacnBHPVe687ugkpT+axCkkhBSyRSjWI2xf1KMMVmYiQdWksK9BEFiQoiYLIlvJA3/zeTzCejP0RbB6YPbhZuB+0pR3KcdX0LaJtju0ZgBL8Bd+sbz2QIaU2OfBX3BaQLsgZysQtrk0M8Sh1A0w3DyyYnGnAiZ4gqZ/TvI2A8OGd1YIbF7+F3P+B6dYpYdsJNZgrjO0UdOIhmom0nwL0pnfnzkL1803jAoKhvyAAAAAElFTkSuQmCC" /> </a> </p> <hr/>Powerful Machine Learning library for Deno
### Installation
There is no installation step required. You can simply import the library and you're good to go :)
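Getting started is therefore just an import. A minimal sketch, using the same JSR specifier as the examples below:

```typescript
// No install step: Deno fetches and caches the package from JSR on first run.
import { Sequential } from "jsr:@denosaurs/netsaur";
```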
### Features
- Lightweight and easy-to-use neural network library for Deno.
- Blazingly fast and efficient.
- Provides a simple API for creating and training neural networks.
- Can run on both the CPU and the GPU (WIP).
- Allows you to run the code without downloading any dependencies ahead of time.
- Perfect for serverless environments.
- Allows you to quickly build and deploy machine learning models for a variety of applications with just a few lines of code.
- Suitable for both beginners and experienced machine learning practitioners.
### Backends

Netsaur can run on any of the following backends:

- CPU: a native backend written in Rust. The fastest option, but it does not work on the Edge.
- WASM: a WebAssembly backend. Slower than the CPU backend, but it runs on the Edge and in serverless environments.
- GPU: work in progress.
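A minimal sketch of how a backend is chosen, using only the symbols that appear in the examples below (`setupBackend`, `CPU`, `WASM`):

```typescript
import { CPU, setupBackend, WASM } from "jsr:@denosaurs/netsaur";

// Pick one backend for the process before constructing a network.
await setupBackend(CPU); // fast native backend, written in Rust
// await setupBackend(WASM); // portable backend for Edge/serverless targets
```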
### Examples

For complete, runnable examples, see the QuickStart below, which trains an XOR network on both the CPU and WASM backends.
### Maintainers
- Dean Srebnik (@load1n9)
- CarrotzRule (@carrotzrule123)
- Pranev (@retraigo)
### QuickStart
This example shows how to train a neural network to predict the output of the XOR function, using our speedy CPU backend written in Rust.
```typescript
import {
  Cost,
  CPU,
  DenseLayer,
  Sequential,
  setupBackend,
  SigmoidLayer,
  tensor1D,
  tensor2D,
} from "jsr:@denosaurs/netsaur";

/**
 * Setup the CPU backend. This backend is fast but doesn't work on the Edge.
 */
await setupBackend(CPU);

/**
 * Creates a sequential neural network.
 */
const net = new Sequential({
  /**
   * The input shape: a batch of 4 samples, each with 2 features.
   */
  size: [4, 2],

  /**
   * The silent option is set to true, which means that the network will not
   * output any logs during training.
   */
  silent: true,

  /**
   * Defines the layers of the neural network for the XOR example.
   * The network has two input neurons and one output neuron.
   * The layers are defined as follows:
   * - A dense layer with 3 neurons.
   * - A sigmoid activation layer.
   * - A dense layer with 1 neuron.
   * - A sigmoid activation layer.
   */
  layers: [
    DenseLayer({ size: [3] }),
    SigmoidLayer(),
    DenseLayer({ size: [1] }),
    SigmoidLayer(),
  ],

  /**
   * The cost function used for training the network is the mean squared error (MSE).
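   * MSE = (1/n) * sum((target - prediction)^2) over the n training samples.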
   */
  cost: Cost.MSE,
});

/**
 * Train the network on the given data.
 */
net.train(
  [
    {
      inputs: tensor2D([
        [0, 0],
        [1, 0],
        [0, 1],
        [1, 1],
      ]),
      outputs: tensor2D([[0], [1], [1], [0]]),
    },
  ],
  /**
   * The number of iterations is set to 10000.
   */
  10000,
);

/**
 * Predict the output of the XOR function for the given inputs.
 */
const out1 = (await net.predict(tensor1D([0, 0]))).data;
console.log(`0 xor 0 = ${out1[0]} (should be close to 0)`);

const out2 = (await net.predict(tensor1D([1, 0]))).data;
console.log(`1 xor 0 = ${out2[0]} (should be close to 1)`);

const out3 = (await net.predict(tensor1D([0, 1]))).data;
console.log(`0 xor 1 = ${out3[0]} (should be close to 1)`);

const out4 = (await net.predict(tensor1D([1, 1]))).data;
console.log(`1 xor 1 = ${out4[0]} (should be close to 0)`);
```
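Assuming the snippet above is saved as, say, `xor.ts` (a placeholder name), it can be run directly with `deno run -A xor.ts`; Deno downloads and caches the dependencies on the first run.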
### Use the WASM Backend
By changing from the CPU backend to the WASM backend we sacrifice some speed, but this allows us to run on the Edge.
```typescript
import {
  Cost,
  DenseLayer,
  Sequential,
  setupBackend,
  SigmoidLayer,
  tensor1D,
  tensor2D,
  WASM,
} from "jsr:@denosaurs/netsaur";

/**
 * Setup the WASM backend. This backend is slower than the CPU backend but works on the Edge.
 */
await setupBackend(WASM);

/**
 * Creates a sequential neural network.
 */
const net = new Sequential({
  /**
   * The input shape: a batch of 4 samples, each with 2 features.
   */
  size: [4, 2],

  /**
   * The silent option is set to true, which means that the network will not
   * output any logs during training.
   */
  silent: true,

  /**
   * Defines the layers of the neural network for the XOR example.
   * The network has two input neurons and one output neuron.
   * The layers are defined as follows:
   * - A dense layer with 3 neurons.
   * - A sigmoid activation layer.
   * - A dense layer with 1 neuron.
   * - A sigmoid activation layer.
   */
  layers: [
    DenseLayer({ size: [3] }),
    SigmoidLayer(),
    DenseLayer({ size: [1] }),
    SigmoidLayer(),
  ],

  /**
   * The cost function used for training the network is the mean squared error (MSE).
   */
  cost: Cost.MSE,
});

/**
 * Train the network on the given data.
 */
net.train(
  [
    {
      inputs: tensor2D([
        [0, 0],
        [1, 0],
        [0, 1],
        [1, 1],
      ]),
      outputs: tensor2D([[0], [1], [1], [0]]),
    },
  ],
  /**
   * The number of iterations is set to 10000.
   */
  10000,
);

/**
 * Predict the output of the XOR function for the given inputs.
 */
const out1 = (await net.predict(tensor1D([0, 0]))).data;
console.log(`0 xor 0 = ${out1[0]} (should be close to 0)`);

const out2 = (await net.predict(tensor1D([1, 0]))).data;
console.log(`1 xor 0 = ${out2[0]} (should be close to 1)`);

const out3 = (await net.predict(tensor1D([0, 1]))).data;
console.log(`0 xor 1 = ${out3[0]} (should be close to 1)`);

const out4 = (await net.predict(tensor1D([1, 1]))).data;
console.log(`1 xor 1 = ${out4[0]} (should be close to 0)`);
```
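Since the WASM backend needs no native binaries, this same script can also be deployed to serverless platforms, in line with the serverless use case listed under Features.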
### Documentation
The full documentation for Netsaur can be found here.
### License
Netsaur is licensed under the MIT License.