Automatic range fitting: noise sensor output > illuminated LEDs

I want to map a noise sensor's output, whose minimum and maximum values are 30 and 130, to a string of 24 LEDs. The sensor will be used along a road with tram tracks, cobblestones and tarmac, and depending on location, the output values fluctuate between 60 - 75, or 35 - 80, or 50 - 115, and so forth. Sometimes the amplitude change is small, sometimes large. At the moment, I hard-code lower and upper boundary values (59 and 76) for map() so that ideally all 24 LEDs are used (orange in the screenshot), but that is not a good solution. How would I go about automating: output low - output high > 0 - 23?

#include <RunningMedian.h>
RunningMedian runningMedian = RunningMedian(5); // Window size

// Variables that remain constant
const byte pinSEN0232 = A0;

// Variables that can change
float voltage = 0;
float raw = 0;
float median = 0;
float average = 0;
int scaled = 0;

void setup()
{
  Serial.begin(115200);
}

void loop()
{
  voltage = analogRead(pinSEN0232) / 1024.0 * 5; // 5V microcontroller
  raw = voltage * 50.0; // Convert voltage to dB, per the sensor's 50 dB/V characteristic

  runningMedian.add(raw);
  
  median = runningMedian.getMedian();
  average = runningMedian.getAverage(5);
  scaled = constrain(map(average, 59, 76, 0, 23), 0, 23); // Later LED numbers 0 - 23

  Serial.print("raw:");
  Serial.print(raw);
  Serial.print(",");

  Serial.print("median:");
  Serial.print(median);
  Serial.print(",");

  Serial.print("average:");
  Serial.print(average);
  Serial.print(",");

  Serial.print("scaled:");
  Serial.println(scaled);

  delay(10);
}

Just saw that Rob Tillaart's library has getLowest() and getHighest() methods. Used as lower and upper boundaries in map(), together with a large running median window, they scale the output into the 0 - 23 range quite well (screenshot).

But maybe I should rather choose a longer time interval instead, say 10 seconds, and use the minimum and maximum found in that interval as lower and upper boundaries? So I start with 30 and 130 once powered, and then the lower and upper boundaries are updated every 10 seconds (the LEDs thus become a somewhat lagging indicator)? Would that be better?

What's the "professional" word for this? Dynamic ranging? Auto-scaling? Range-fitting?

Are you trying to increase the high and decrease the low dynamically?

  newRead = analogRead(pinSEN0232);

  if (newRead > newHigh)
    newHigh = newRead;
  if (newRead < newLow)
    newLow = newRead;

  newAve = (newHigh + newLow) / 2; // Parentheses needed: division binds tighter than addition
  scaled = map(newAve, newLow, newHigh, 0, 23);

Not increasing the high and decreasing the low, but dynamically shifting both up or down, as the noise level changes, so that the whole "bandwidth" of 0 - 23 is always used:

Noise range > LED range

32 - 47 > 0 - 23
74 - 96 > 0 - 23
81 - 124 > 0 - 23
61 - 85 > 0 - 23
38 - 121 > 0 - 23
...

See the first screenshot, where the upper and lower boundaries are roughly hard-coded for one particular situation: the orange graph is scaled into a range of 7 - 18, not 0 - 23, but follows the original signal well. If the noise level changes, I would have to hard-code the upper and lower boundaries again, and so forth.

The getLowest() and getHighest() methods, based on the size of the running median window, do scale, but the relation to the original signal is more or less lost (window size 27 samples). Hence the good old min/max calibration you suggest, not in setup(), but in loop(), may be better.

Tried to simulate the dynamic lower and upper boundary updating in WOKWI. Is that a good approach?

const int updateInterval = 4000; // Time between boundary updates in milliseconds
unsigned long timeOfUpdate = 0; // Timestamp for when the boundary update happened

byte boundaryLow = 130; // Start at the sensor's extremes, so the first
byte boundaryHigh = 30; // readings immediately pull both boundaries in

void setup()
{
  Serial.begin(9600);

  randomSeed(analogRead(0));
}

void loop()
{
  byte sensorOutput = random(30, 131); // To fake the noise sensor's output

  if (sensorOutput < boundaryLow)
  {
    boundaryLow = sensorOutput;
  }

  if (sensorOutput > boundaryHigh)
  {
    boundaryHigh = sensorOutput;
  }

  //scaled = constrain(map(sensorOutput, boundaryLow, boundaryHigh, 0, 23), 0, 23);

  updateBoundaries();

  delay(400); // So 30 and 130 don't occur almost instantly
}

void updateBoundaries()
{
  // Is it time to update the boundary values?
  if (millis() - timeOfUpdate >= updateInterval)
  {
    Serial.print(boundaryLow);
    Serial.print(" | ");
    Serial.print(boundaryHigh);

    Serial.println(" Boundary values updated");

    // Reset the boundaries so the next interval starts tracking afresh
    boundaryLow = 131;
    boundaryHigh = 30;

    // Update the timestamp with the time when the boundary update happened
    timeOfUpdate = millis();
  }
}

Seems to work rather nicely. Sort of like an "on the fly calibration", or whatever this sort of thing is called. Updating the lower and upper boundaries every four seconds, and now the mapping uses the full 0 - 23 range. Let's see how this plays out with LEDs and some LED code.

I see...

Maybe this...

  • discard oldLow/oldHigh
  • set newLow to 1023 (the maximum ADC reading, i.e. the opposite extreme)
  • set newHigh to 0 (the minimum ADC reading, i.e. the opposite extreme)
  • read the sensor and track newLow/newHigh
  • map with scaled = map(newAve, newLow, newHigh, 0, 23);

See above, seems to work. Every n seconds, the lower and upper boundaries are reset, as the noise volume fluctuates over time. But I have to test it for, say, an hour, with LEDs and LED code, and then see what the boundary update interval, now arbitrarily set to four seconds, should be.

Ok.

This topic was automatically closed 180 days after the last reply. New replies are no longer allowed.