
Topic: Simple MLP - NeuralNetwork Library (Read 3618 times)

giorgos_xou

#15
May 30, 2019, 04:06 pm Last Edit: May 30, 2019, 04:15 pm by Giorgos_xou
Just tested on a Maple Mini STM32F103 at 72MHz, the Backpropagation_double_Xor example.
Nice! I will add it to the "tested" section [...] (with a reference link [tested by] to your profile here, if that is OK with you?)

On ESP8266 at 80MHz I added some yield(); here
First time I hear about yield(); I didn't know what it was, so I searched it and it is pretty interesting..

It took about 131 seconds
I am sure it could be way faster if I wasn't deleting and recreating some arrays for SRAM economy (I just have to keep in mind to add a property for this too, in the future). But still, 2 minutes doesn't seem that fine for 3000 loops at 80MHz; maybe yield() plays a role too, because if I am not wrong the Arduino was around 2 minutes too, hmm... but still, thumbs up for testing! (:
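
For context on yield(): from what I understand, the ESP8266 core runs a software watchdog, and a long blocking training loop can trigger a reset (or starve the WiFi stack) unless you periodically hand control back. A rough sketch of where the calls could go, reusing the FeedForward/BackProp loop from the examples (the once-per-epoch placement is just my assumption of what zoomx did):
Code: [Select]

// Inside setup(), after creating NN (see the full example later in this thread).
// Calling yield() once per epoch keeps the ESP8266 watchdog serviced.
for (int i = 0; i < 3000; i++) // epochs
{
  for (int j = 0; j < NumberOf(inputs); j++)
  {
    NN->FeedForward(inputs[j]);      // forward pass
    NN->BackProp(expectedOutput[j]); // backpropagate the error
  }
  yield(); // hand control back to the ESP8266 core
}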

zoomx

(with a reference link [tested by] to your profile here, if that is OK with you?)
Yessss :-)
Same sketch on Arduino took about 803 seconds, more than 13 minutes. Loops are 8000!

giorgos_xou

#17
May 30, 2019, 06:29 pm Last Edit: May 30, 2019, 06:41 pm by Giorgos_xou
Yessss :-)
(:

Same sketch on Arduino took about 803 seconds, more than 13 minutes.
!!! O:

Loops are 8000!
You are right! I thought I had 3000, but 3000 was in the other example...

If you play with the learning rates, changing them to a higher value, you can use ~3000 loops and still get roughly the same outputs.

I should have changed the learning rates for this sketch, because it is actually quite slow (and people might think that it doesn't work).

zoomx

Maybe you can use a variable learning rate: higher at the start and going down as the learning progresses.

giorgos_xou

Maybe you can use a variable learning rate: higher at the start and going down as the learning progresses.
True; however, I'll see what I will do before the next version update...
(Thank you for helping me out and for your interest, it means a lot to me (: )
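
For the record, the idea could look roughly like the sketch below. Note that the learning-rate members used here (LearningRateOfWeights, LearningRateOfBiases) are hypothetical names, just to illustrate the schedule; how the rate is actually exposed depends on the version of the library:
Code: [Select]

// Sketch of a variable learning rate: start high, decay every epoch.
// LearningRateOfWeights / LearningRateOfBiases are HYPOTHETICAL member
// names used only to show where the rate would be applied.
float lr = 0.7; // relatively high starting rate

for (int i = 0; i < 3000; i++) // epochs
{
  NN->LearningRateOfWeights = lr; // hypothetical
  NN->LearningRateOfBiases  = lr; // hypothetical

  for (int j = 0; j < NumberOf(inputs); j++)
  {
    NN->FeedForward(inputs[j]);
    NN->BackProp(expectedOutput[j]);
  }

  lr *= 0.999; // exponential decay: ~0.7 -> ~0.035 after 3000 epochs
}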

josepramon

Hi Giorgos.
Thanks for your work. It is very interesting and useful! I tried your library on an Arduino Nano 33 IoT and it works fine and, in my opinion, fast. I tested your Simple XOR example with 5000 iterations.
But I have a question: how is it possible to use the weights calculated in 'void setup()' inside 'void loop()'? It makes sense that the weights are calculated once in 'void setup()'; however, how can I use them in 'void loop()'? Or should I build something like W1×X1 + W2×X2 + ... + Wn×Xn myself? Is there a way or function to get the outputs every time the input changes? Thanks!

giorgos_xou

#21
Feb 27, 2020, 10:54 pm Last Edit: Feb 27, 2020, 10:57 pm by Giorgos_xou
Hi Giorgos.
Hi @josepramon

Thanks for your work. It is very interesting and useful!
Thank you for your interest and your appreciation, it means a lot to me <3  (:
(and I am sorry for my late response...)

I tried your library on an Arduino Nano 33 IoT and it works fine and, in my opinion, fast.
Nice (:

how is it possible to use the weights calculated in 'void setup()' inside 'void loop()'? ... Is there a way or function to get the outputs every time the input changes?
Yes, it is possible. Here is a brief example of how:
Code: [Select]

#define NumberOf(arg) ((unsigned int) (sizeof (arg) / sizeof (arg [0]))) // returns the number of elements in an array (used below for the layers and the training inputs)

#include <NeuralNetwork.h>

NeuralNetwork *NN; // pointer to a NeuralNetwork object that will be created in RAM


const unsigned int layers[] = {2,4,1}; // 3 layers: (1st) 2 input neurons, (2nd) 4 hidden neurons, (3rd) 1 output neuron
float *outputs; // the 3rd/output layer's outputs

//Default Inputs [for Training only]
const float inputs[4][2] = {
  {0, 0}, //0
  {0, 1}, //1
  {1, 0}, //1
  {1, 1}  //0
};
const float expectedOutput[4][1] = {{0},{1},{1},{0}}; // values we expect from the 3rd/(output) layer of the neural network; in other words, the training targets.


void setup(){
 
  Serial.begin(9600);

  NN = new NeuralNetwork(layers,NumberOf(layers)); //Initialization of NeuralNetwork object
 
  //Trains the NeuralNetwork for 3000 epochs (= training loops)
  for(int i=0; i < 3000; i++)
  {
    for (int j = 0; j < NumberOf(inputs); j++)
    {
       NN->FeedForward(inputs[j]); // feeds the inputs forward through the layers of the NN and gets the output
       NN->BackProp(expectedOutput[j]); // compares the output with the expected output and adjusts the weights (backpropagation)
    }
  }

   NN->print(); // prints the weights and biases of each layer

}


float INPUT_[1][2]; // dynamic/changeable input variable
void loop() {

  //As a brief example, here you could have live input from two buttons/switches (or a feed from a sensor, if it were a different NN)
  INPUT_[0][0] = 0; // ...let's say input from the 1st button/switch?
  INPUT_[0][1] = 1; // ...let's say input from the 2nd button/switch?
 
  outputs = NN->FeedForward(INPUT_[0]); // feeds the dynamic INPUT_[] forward through the NN and gets the output
  Serial.println(outputs[0], 7); // prints the output with 7 digits after the decimal point

  delay(1000);
 
}
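
And to make the button/switch comment above concrete, here is a rough sketch of how the live input could come from two real switches. The pin numbers and pull-up wiring are my own arbitrary assumptions for the example, not something the library dictates:
Code: [Select]

// Hypothetical wiring: two switches on pins 2 and 3 (arbitrary choice),
// using the internal pull-ups, so a pressed switch reads LOW.
const int PIN_A = 2;
const int PIN_B = 3;

// add to setup():
//   pinMode(PIN_A, INPUT_PULLUP);
//   pinMode(PIN_B, INPUT_PULLUP);

void loop() { // a variant of the loop() above

  INPUT_[0][0] = (digitalRead(PIN_A) == LOW) ? 1 : 0; // pressed -> 1
  INPUT_[0][1] = (digitalRead(PIN_B) == LOW) ? 1 : 0;

  outputs = NN->FeedForward(INPUT_[0]); // prediction for the live input
  Serial.println(outputs[0], 7);

  delay(1000);
}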




Truth is that I should have more examples in my library...

PS: For everyone here, I am thinking of making some upgrades in the coming months...

josepramon

Hi Giorgos!
Thanks for your help. Your library is fantastic!
JR

