Precision timing with the Arduino

This is probably a simple question, but I am trying to write an application that requires very precise timing. When we write the line of code:

delayMicroseconds(50);

are we truly delaying for 50 microseconds, or is it 50 plus however much time it takes to run the delayMicroseconds routine itself?

Well, there is never a "perfect" 50 ms; it's going to be very close, probably more like 50.43296 ms, but you can't tell the difference! So don't worry. And I don't believe the code stalls to execute; it just runs what it needs to run. But then again, you could always test it to make sure...
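
A quick way to test it is to time a large batch of calls with micros() and average, since micros() itself only has 4 us resolution on a 16 MHz board. A minimal sketch, assuming your core has micros() (the batch size is arbitrary):

void setup() {
  Serial.begin(9600);
}

void loop() {
  const unsigned int iterations = 1000;        // arbitrary batch size
  unsigned long start = micros();
  for (unsigned int i = 0; i < iterations; i++) {
    delayMicroseconds(50);
  }
  unsigned long elapsed = micros() - start;
  Serial.print("average per call: ");
  Serial.print(elapsed / (float)iterations);   // includes for-loop overhead
  Serial.println(" us");
  delay(1000);
}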

He's talking about the microsecond delay function, though, so being off by 0.432 ms on a 50 us delay would be roughly 800% off.

But there is no such thing as "truly delaying for 50 microseconds": any timing device has a finite level of precision.

In this case, the answer in the documentation is "This function works very accurately in the range 3 microseconds and up. We cannot assure that delayMicroseconds will perform precisely for smaller delay-times."

I would hope that for values greater than 3 us, it will get you to within a microsecond.

So the delayMicroseconds routine takes into account the time it takes to actually run through its own lines of code, and works out the calculations necessary to achieve the requested time delay (plus or minus some reasonable error). Is that a correct statement?

I'm pretty sure it takes that into account. That's where the 3 us minimum comes in; it's probably the overhead, which they presumably calibrated through trial and error.
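
For what it's worth, the core implements it as a calibrated busy-wait: the requested microseconds are converted into passes of a loop with a known cycle count, and a couple of passes are subtracted up front to pay for the call overhead. A simplified sketch of the idea for a 16 MHz part (not the actual wiring.c source; the overhead constant here is illustrative):

void roughDelayMicroseconds(unsigned int us) {
  us <<= 2;                     // 4 loop passes per microsecond at 16 MHz
  us -= 2;                      // illustrative correction for call overhead
  __asm__ __volatile__ (
    "1: sbiw %0,1" "\n\t"       // 2 cycles: decrement the 16-bit counter
    "brne 1b"                   // 2 cycles while the branch is taken
    : "=w" (us)
    : "0" (us)
  );
}

Each pass of the loop takes 4 clock cycles (0.25 us at 16 MHz), which is why 4 passes make up one microsecond.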

I'm using the function to strobe a Luxeon LED for 50 us at about 5 times its rated power limit, so I would hope it's not too much longer. I haven't gotten far enough to hook it up to the oscilloscope yet, but I plan to.
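
For the record, the strobe itself only needs a couple of lines; something along these lines, where pin 9 and the 10 ms off-time are made-up values for illustration:

const int strobePin = 9;        // hypothetical pin driving the LED switch

void setup() {
  pinMode(strobePin, OUTPUT);
}

void loop() {
  digitalWrite(strobePin, HIGH);
  delayMicroseconds(50);        // the 50 us overdrive pulse in question
  digitalWrite(strobePin, LOW);
  delay(10);                    // off-time to keep the average power safe
}

Bear in mind that digitalWrite() itself takes a few microseconds on a 16 MHz AVR, so the pulse on the scope will come out somewhat longer than 50 us; direct port writes avoid most of that overhead.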

Excellent! Thanks both of you for clearing that up for me.

If you need precise timing, use a spare timer interrupt. Then you only need to watch out for interrupt latency if your application is more than toggling an output's state.
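
For example, on a 16 MHz ATmega168/328 board you can run Timer1 in CTC mode and get an interrupt every 50 us. A sketch of the idea (register names are for the ATmega328, and toggling pin 13 is just a stand-in for your application):

void setup() {
  pinMode(13, OUTPUT);
  noInterrupts();
  TCCR1A = 0;                   // normal port operation
  TCCR1B = 0;
  TCNT1 = 0;
  OCR1A = 799;                  // 16 MHz / 1 prescale * 50 us = 800 ticks
  TCCR1B |= (1 << WGM12);       // CTC mode, TOP = OCR1A
  TCCR1B |= (1 << CS10);        // no prescaling: timer runs at 16 MHz
  TIMSK1 |= (1 << OCIE1A);      // enable compare match A interrupt
  interrupts();
}

ISR(TIMER1_COMPA_vect) {
  PORTB ^= (1 << PB5);          // toggle pin 13 directly, minimal latency
}

void loop() {
  // nothing to do here; the timing lives entirely in the interrupt
}

With the hardware timer doing the counting, any jitter comes only from interrupt latency (a few cycles, plus whatever other ISRs are running), not from the code in loop().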