I was recently tinkering and noticed a couple of issues with delayMicroseconds(). The first is that calling it with an argument of 0 leads to a ~17 ms delay instead of returning immediately. This is apparently well known and will presumably be fixed at some point, or at least documented on the site.
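For reference, the zero-argument case is easy to see on a scope with a throwaway sketch like this (pin 8 is an arbitrary choice; any digital pin works):

void setup() {
  pinMode(8, OUTPUT);     // arbitrary pin for the scope probe
}

void loop() {
  digitalWrite(8, HIGH);
  delayMicroseconds(0);   // requested 0 us; the pulse measures roughly 17 ms
  digitalWrite(8, LOW);
  delay(100);             // spacing so the long pulses are easy to trigger on
}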
The real crux of my post is that delayMicroseconds() seems to add ~4 µs of extra delay to every call. I understand that short delays won't be perfectly accurate, but no matter what argument I pass, the measured delay is about 4 µs longer than requested. I tested values up to 20 µs and the delay is consistently off by 4 µs.
I am using Arduino 1.0.3 on an Uno R3 board. I have verified my results with an oscilloscope and am confident of the problem. I can provide a sample sketch and some captures from the scope if necessary, but it should be easy to verify for anyone who has the proper test equipment.
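For what it's worth, a sketch along these lines reproduces it (the pin and the 10 µs value are arbitrary; the pulse width on the scope is the measured delay):

const int testPin = 8;    // arbitrary pin for the scope probe

void setup() {
  pinMode(testPin, OUTPUT);
}

void loop() {
  noInterrupts();         // keep the timer0 interrupt from stretching the pulse
  digitalWrite(testPin, HIGH);
  delayMicroseconds(10);  // requested 10 us; per the measurements above it runs ~4 us long
  digitalWrite(testPin, LOW);
  interrupts();
  delay(1);               // gap between pulses for easy triggering
}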