Strange behavior in time measurement.

Thank you for your replies.

PaulS:
We'd need to see all of your code. It looks like _prevTime is somehow larger than now (as returned by micros).

When I include the commented-out lines, it works perfectly: _prevTime is fine, and so are micros() and dif. But if I print the value of dif inside the if-condition, its value changes.
I will post the code, but give me some time to clean it up.

DavidOConnor:
Try disabling interrupts before you calculate dif and enable them afterwards.

I will try this and will report the results.

GoForSmoke:
10 micros is probably way too short. 10 millis is closer to practical.

The micros() function has a resolution of 4 microseconds (on a 16 MHz board its return value is always a multiple of 4).

From your results it looks like you are subtracting now from the previous time, or getting them in the wrong order.

You are using unsigned long, and not long, everywhere in there?
You are ending all constants meant as unsigned long with UL? Ex: const unsigned long limit = 10UL;

If there is no signal for, let's say, 10.000 microseconds, then turn off an LED.

Please say that you are not using float or double. Please.

10.000 is not a float or double. It is ten thousand, written that way just for readability. = 10 ms.
Let's say the last measurement was 5000, so _prevTime = 5000. In the loop it takes the current time and calculates the difference. If the current time is 6000, the difference will be dif = 1000; if dif is greater than 10.000 (ten thousand), it will turn off an LED or whatever. The order is correct, and the results are fine if I just include the commented-out lines. That's the strange thing I can't understand.
All variables i use to calculate the time are volatile unsigned long. And they don't end with UL. I have to google it, never heard about it.