FFT behaviour when everything is silent

Hello,
as the heading says, I am a little confused about whether my problem is completely normal or not.

So I am currently trying to visualise some music on WS2812B LED strips with the Adafruit NeoPixel library, and to capture the music I use the MAX9814 microphone, which is from Adafruit as well. I process the sound with the arduinoFFT library.

I will post the code to obtain the frequency using FFT below. My questions are:

  1. Is it normal that when there is no music or noise at all, the values jump around and look pretty random? If not, do you have any ideas on what's wrong with the code?
  2. Since I am not 100% sure how the FFT algorithm works, I would like to know why the higher the 'SAMPLING_FREQUENCY', the higher the reported value during silence. For example, if I set it to 1000, I get a value of around 50 in silence, i.e. it prints 50Hz as the major peak in complete silence. If I set it to 5000 or more, however, the major peak comes out higher, at around 200Hz.
  3. It would also be interesting to know why there is a comment next to that define saying the frequency must be less than 10000Hz due to the ADC. Why is that the case?

#include "arduinoFFT.h"

#define SAMPLES 512             //Must be a power of 2
#define SAMPLING_FREQUENCY 7500 //Hz, must be less than 10000 due to ADC

arduinoFFT FFT = arduinoFFT();

unsigned int sampling_period_us;
unsigned long microseconds;

double vReal[SAMPLES];
double vImag[SAMPLES];

void setup() {
    Serial.begin(115200);

    sampling_period_us = round(1000000*(1.0/SAMPLING_FREQUENCY));
}

void loop() {
    /*SAMPLING*/
    for(int i=0; i<SAMPLES; i++)
    {
        microseconds = micros();    //Overflows after around 70 minutes!

        vReal[i] = analogRead(0);
        vImag[i] = 0;

        while(micros() < (microseconds + sampling_period_us)){
        }
    }

    /*FFT*/
    FFT.Windowing(vReal, SAMPLES, FFT_WIN_TYP_HAMMING, FFT_FORWARD);
    FFT.Compute(vReal, vImag, SAMPLES, FFT_FORWARD);
    FFT.ComplexToMagnitude(vReal, vImag, SAMPLES);
    double peak = FFT.MajorPeak(vReal, SAMPLES, SAMPLING_FREQUENCY);

    /*PRINT RESULTS*/
    //Print out what frequency is the most dominant.
    Serial.println(peak);
    delay(100);  //Repeat the process over and over
}

Oh, and by the way: I'm using a NodeMCU and not an Arduino, if that matters.

I know those are a lot of questions, but it would be nice if anyone could answer even one of them. :)

Thanks in advance.

I don't know the details of the FFT library, but if you disconnect the microphone and ground the analog input I'm betting the FFT will give all zeros...

The real world is never perfectly silent, and your microphone board has automatic gain control, so when it's very quiet the gain will automatically get cranked up to amplify the background noise.

The randomness will be the noise floor from the microphone - it should be fairly flat apart from multiples of the mains frequency and things like the LED update frequency (it's really hard to prevent interference from digital circuitry getting into a low-level microphone signal if they share the same supply).

I also note that the MAX9814 microphone amp has a fairly high input noise density too, 30nV/sqrt(Hz), which the datasheet describes as "low noise density" - the state of the art is way better than that.

When the sampling rate increases for an FFT with a fixed number of points, the frequency label on each bin increases too. I'm not sure if that fully explains what you see.

For the 512-point FFT you are using you should get 257 output bins; for fs = 1000Hz that means roughly 2Hz resolution from 0 to 500Hz.
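
To make the scaling concrete, here is a minimal sketch of the bin arithmetic (binFrequency is just an illustrative helper, not part of arduinoFFT):

#define SAMPLES 512

// Bin i of an N-point FFT sampled at fs is centred on i * fs / N,
// so the bin width is fs / SAMPLES.
double binFrequency(int i, double fs) {
    return i * fs / SAMPLES;
}

// fs = 1000 -> bin width ~1.95Hz, bins span 0..500Hz
// fs = 7500 -> bin width ~14.6Hz, bins span 0..3750Hz
// The same noisy bin index therefore gets a higher frequency label
// as fs goes up, which matches the 50Hz-vs-200Hz observation.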

I'd add a load of extra decoupling to the supply to the MAX module and see if that has an effect.

BTW your timing loop has a classic flaw:

        while(micros() < (microseconds + sampling_period_us)){
        }

needs to be

        while(micros() - microseconds < sampling_period_us){
        }

to be valid across the micros() wraparound - always subtract the two times, then compare the difference. Then nothing goes wrong at the 70-minute mark; the difference will always be correct.
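
The reason this works is that unsigned arithmetic is modulo 2^32, so the subtraction gives the true elapsed time even across the wrap. A quick illustration with made-up values either side of the rollover:

unsigned long start = 4294967290UL;   // micros() just before the overflow
unsigned long now   = 10UL;           // micros() shortly after it wrapped
unsigned long elapsed = now - start;  // = 16, correct, thanks to mod 2^32

// The original test "micros() < start + period" fails here, because
// start + period wraps too and the comparison gives the wrong answer.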

Your loop can be more accurate too:

drop this line from inside the sampling loop (set microseconds = micros() once, before the loop, instead):

        microseconds = micros();    //Overflows after around 70 minutes!

and change your delay loop to:

        while(micros() - microseconds < sampling_period_us){
        }
        microseconds += sampling_period_us;

Then your time points are exactly right - the only time you should set microseconds to the result of micros() is once at the start, just before the sampling loop. Thereafter the points in time are just increments.
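
Putting those changes together, the sampling part of loop() would look something like this (a sketch assembled from the original code plus the fixes above):

void loop() {
    /*SAMPLING*/
    microseconds = micros();        // read the clock once, at the start
    for(int i=0; i<SAMPLES; i++)
    {
        vReal[i] = analogRead(0);
        vImag[i] = 0;

        // Rollover-safe wait for the next sample point
        while(micros() - microseconds < sampling_period_us){
        }
        microseconds += sampling_period_us;  // advance in exact steps
    }

    // ... FFT, MajorPeak and Serial.println as before ...
}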

[ BTW the comment about the max ADC sample rate is there because an analogRead() on the ATmegas takes about 110µs with the default settings - so you can't get more than about 9kSPS without changing some settings ]
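
For reference, the "settings" in question on an ATmega328P are the ADC clock prescaler bits; a common tweak (which trades some accuracy, and does not apply to the NodeMCU's ESP8266 ADC) looks like this:

// analogRead() runs the ADC clock at 16MHz / 128 = 125kHz, and one
// conversion takes 13 ADC clocks, i.e. ~104us -> roughly a 9kSPS ceiling.
// Dropping the prescaler to /16 gives ~13us conversions, at some cost in
// accuracy (the datasheet recommends <=200kHz ADC clock for full resolution).
void setup() {
    ADCSRA = (ADCSRA & ~0x07) | 0x04;  // ADPS2:0 = 100 -> prescaler /16
}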

Thank you very much for these explanations, and also thanks for the tips on how to improve the code! :)