Hey there!
I've run into a problem that I have no idea how to fix. I have an array of three ints, and I am trying to pull out the highest value, the lowest value, and the one in the middle.
To do this, I am using the Average library. The plan is to use its functions to find the highest and the lowest values, and then find the middle one by subtracting the highest and the lowest from the sum. For example, with {3, 8, 3} the sum is 14, the maximum is 8 and the minimum is 3, so the middle value should be 14 - 8 - 3 = 3.
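Just to be explicit about what I expect, here is the same idea as a library-free sketch, with plain loops instead of the Average functions (only an illustration, not my actual code):

int d[] = {3, 8, 3};

void setup()
{
  Serial.begin(9600);

  int hi = d[0], lo = d[0], sum = d[0];
  for (int i = 1; i < 3; i++) {
    if (d[i] > hi) hi = d[i]; // track the highest value
    if (d[i] < lo) lo = d[i]; // track the lowest value
    sum += d[i];              // accumulate the sum
  }
  int mid = sum - hi - lo;    // middle = sum - max - min

  Serial.println(hi);  // 8
  Serial.println(lo);  // 3
  Serial.println(mid); // 3
}

void loop() {}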
My test code looks like this:
#include <Average.h>

int d[] = {3, 8, 3};

void setup()
{
  Serial.begin(9600);

  int max1 = (maximum(d,3),DEC);
  int min1 = (minimum(d,3),DEC);
  int mid1 = d[0] + d[1] + d[2] - max1 - min1;

  Serial.println(" Max: ");
  Serial.println(maximum(d,3),DEC);
  Serial.println(" Min: ");
  Serial.println(minimum(d,3),DEC);
  Serial.println(" Mid: ");
  Serial.println(mid1);
}

void loop() {}
Here comes the weird part: when I print the maximum and minimum values directly, I get the right results. But when I assign the value to an int as above, it always comes out as 10.
Serial.println(maximum(d,3),DEC); // prints 8
int max1 = (maximum(d,3),DEC);
Serial.println(max1); // prints 10
Why?! :o