Diwa is a lightweight library providing a simple implementation of feedforward artificial neural networks (ANNs) for microcontrollers such as the ESP32, ESP8266, RP2040, and similar development boards (especially boards with PSRAM). It is also compatible with desktop environments (Windows, macOS, and Linux-based OSes), WebAssembly, and even the PSP gaming console. It is designed for resource-constrained environments but can also be used in non-Arduino projects, offering a streamlined solution for tasks that require neural network capabilities.
Diwa stands out as a straightforward and effective solution for implementing artificial neural networks on microcontrollers. Key features include:
- Lightweight: Designed for resource-constrained microcontroller environments yet can still be used within non-Arduino environments.
- Simple Implementation: Provides a basic yet effective implementation of a Feedforward ANN.
- Easy Integration: Suitable for microcontrollers like the ESP8266, ESP32, and RP2040. Also compatible with desktop environments, WebAssembly, and even the PSP gaming console.
- Training Support: Includes methods for training the neural network using backpropagation.
Note
Diwa is primarily intended for lightweight applications. For more intricate tasks, consider using advanced machine learning libraries on more powerful platforms.
See live demo on Wokwi.
Diwa is tested on the following architectures and platforms:
| Arch/Platform | Remarks |
|---|---|
| ✅ ESP32-WROOM / ESP32-WROVER | NodeMCU DevKit (automatically uses PSRAM when available) |
| ✅ ESP8266 | Wokwi emulation |
| ✅ RP2040 | Raspberry Pi Pico (RP2040) |
| 🔼 PSP | PPSSPP emulator (`Diwa::loadFromFile` and `Diwa::saveToFile` not yet supported) |
| ✅ Desktop environments | Works on Windows, macOS, and Linux |
| ✅ WebAssembly (WASM) | Tested via Emscripten |
For a quick start, run the Docker image:
docker run nthnn/diwa:0.0.7
To start using the Diwa library in your Arduino projects, follow these simple steps:
- Open the Library Manager in the Arduino IDE.
- Search for `diwa` and click "Install."
Alternatively, you can follow the steps below:
- Download the Diwa library from the GitHub repository.
- Extract the downloaded archive and rename the folder to "diwa".
- Move the "diwa" folder to the Arduino libraries directory on your computer:
  - Windows: `Documents\Arduino\libraries\`
  - MacOS: `~/Documents/Arduino/libraries/`
  - Linux: `~/Arduino/libraries/`
- Launch the Arduino IDE.
Using Diwa in C++ projects may differ depending on the build tools you are using.
- Include the Diwa git repository as a submodule in your project's `lib` folder (to do this, your project must itself be a git repository):
  git submodule add https://github.com/nthnn/diwa.git lib/diwa
- Add the `lib/diwa/src` folder to your project's `Makefile` include paths.
- Add the `*.cpp` files to the build source files.
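The Makefile steps above could be sketched as the following fragment; the target name `app`, the `main.cpp` entry point, and the `lib/diwa` submodule path are assumptions that will vary per project:

```make
# Hypothetical fragment: builds an app against a vendored Diwa checkout.
CXX      ?= g++
CXXFLAGS += -std=c++17 -Ilib/diwa/src

# Pull every Diwa translation unit into the build.
DIWA_SRC := $(wildcard lib/diwa/src/*.cpp)

app: main.cpp $(DIWA_SRC)
	$(CXX) $(CXXFLAGS) -o $@ $^
```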
Note
Alternatively, you can download the `*.deb` file from the release page to install Diwa as a shared library on Debian-based operating systems.
Setting up Diwa is much the same in every C++ project, although the exact steps depend on your build process and may differ in some instances:
- Add `--std=c++17` to the compiler arguments.
- Include the `src` folder in the include arguments.
- Finally, include the `*.cpp` files from the `src` folder in the compilation.
Alternatively, you can check `examples/psp_xor/build.bat` for reference.
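For a plain command-line build on a desktop platform, the flags described above might combine into an invocation like this; the file names `main.cpp` and `xor_demo` and the `lib/diwa` checkout location are illustrative assumptions:

```shell
# Illustrative desktop build; assumes Diwa was vendored into lib/diwa/.
g++ -std=c++17 -Ilib/diwa/src \
    main.cpp lib/diwa/src/*.cpp \
    -o xor_demo
```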
To access the examples:
- Open the Arduino IDE.
- Click `File > Examples > diwa` to see the list of available examples.
- Upload an example sketch to your Arduino board and see the results in action.
Here's a full usage example for an Arduino environment:
#include <diwa.h>
void setup() {
// Initialize serial communication with a baud rate of 115200
Serial.begin(115200);
// Define training input and output data for XOR operation
double trainingInput[4][2] = {{0, 0}, {0, 1}, {1, 0}, {1, 1}};
double trainingOutput[4][1] = {{0}, {1}, {1}, {0}};
// Create an instance of the Diwa neural network with 2 input neurons,
// 1 hidden layer with 3 neurons, and 1 output neuron
Diwa network;
if(network.initialize(2, 1, 3, 1) == NO_ERROR)
Serial.println(F("Done initializing neural network."));
else {
Serial.println(F("Something went wrong initializing neural network."));
while(true);
}
// Train the network for 10000 epochs using the XOR training data
Serial.println(F("Starting training..."));
for(uint32_t epoch = 0; epoch < 10000; epoch++) {
// Train the network for each set of input and target output values
network.train(6, trainingInput[0], trainingOutput[0]);
network.train(6, trainingInput[1], trainingOutput[1]);
network.train(6, trainingInput[2], trainingOutput[2]);
network.train(6, trainingInput[3], trainingOutput[3]);
// Show accuracy and loss on training for every 1000th epoch
if(epoch % 1000 == 0) {
double accuracy = 0.0, loss = 0.0;
// Calculate accuracy and loss for each training sample
for(uint8_t i = 0; i < 4; i++) {
accuracy += network.calculateAccuracy(trainingInput[i], trainingOutput[i], 3);
loss += network.calculateLoss(trainingInput[i], trainingOutput[i], 3);
}
// Average accuracy and loss over all samples
accuracy /= 4, loss /= 4;
// Print the accuracy and loss
Serial.print(F("Epoch: "));
Serial.print(epoch);
Serial.print(F(", accuracy: "));
Serial.print(accuracy * 100);
Serial.print(F("%, loss: "));
Serial.print(loss * 100);
Serial.println(F("%"));
}
}
Serial.println(F("Training done!"));
// Perform inference on the trained network and print the results
Serial.println(F("Testing inferences..."));
for(uint8_t i = 0; i < 4; i++) {
// Get the current input row
double* row = trainingInput[i];
// Perform inference using the trained network
double* inferred = network.inference(row);
// Print the result for the current input
char str[100];
sprintf(str, "Output for [%.0f, %.0f]: %.0f (%g)\n", row[0], row[1], inferred[0], inferred[0]);
Serial.print(str);
}
}
void loop() {
delay(10);
}
Contributions and feedback are welcome to help enhance this library. If you encounter any issues, have suggestions for improvements, or would like to contribute code, please feel free to do so.
Copyright 2023 - Nathanne Isip
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the “Software”), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED “AS IS”, WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.