Delay function in microseconds in C programming

Hello to the C programming experts out there!

I have a loop to record 500 sound-level samples into an array, but I need to store
each sample every 200 microseconds.
Do you know a simple way to accomplish this? I thought someone wrote a microsecond delay function
a year ago or more.
Thank you,


Check delayMicroseconds()?
It has been part of the language for ages …

You must take into account how long it takes to make a sample. E.g., if taking the sample takes 92 µs, then you should only delay for the remaining 108 µs:

for (int i = 0; i < 500; i++) {
  samples[i] = takeSample();      // your sampling code here, assumed ~92 µs
  delayMicroseconds(200 - 92);    // wait out the rest of the 200 µs period
}
The demo Several Things at a Time illustrates the use of millis() to manage timing without blocking. The same technique can be used with micros().

Another advantage of not using delay() or delayMicroseconds() is that, with the technique in the tutorial, your code will automatically start a new sample every 200 µs regardless of how long the sampling itself takes (within reason).


Got it!

Thank you very much guys.

Pierre Salsiccia, PE, ME