gunjah292:
Now I need to implement a more complex code and was advised to use interrupts.
Generally bad advice, but with no actual information about your code, it is hard to say.
gunjah292:
I just put a delay of 10ms and read the value again to make sure that I am not reading a bounce.
Well, "just put a delay of 10ms" is a very dubious statement. You most certainly cannot use delay() inside an interrupt, for two reasons - it won't work (delay() busy-waits on millis(), which depends on the timer interrupt that is masked while your handler runs, so it hangs), and even if it did work, sitting in a 10 ms wait would defeat the whole purpose of an interrupt.
You poll the decoder lines on every loop() cycle. If you see a change, you make a note of the new status and the millis() time. On each successive pass, you compare the lines against that noted status; if they revert, you cancel the note and treat it as bounce or noise. If however the new status persists until millis() has advanced by 5, you consider it stable and act on it. In the case of the quadrature encoder, you follow the same process independently for the two lines.
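As a sketch of that per-line logic, here is the debounce written as plain C++ so the timing behaviour can be seen (and tested) on its own. On the Arduino you would feed update() the result of digitalRead() and millis() once per loop() pass; the 5 ms window is from the description above and all names are my own.

```cpp
#include <cstdint>

struct Debounce {
  uint8_t stable;               // last validated state of the line
  uint8_t candidate = 0;        // state noted when a change is first seen
  unsigned long changedAt = 0;  // millis() value at that moment
  bool pending = false;

  explicit Debounce(uint8_t initial) : stable(initial) {}

  // Call once per loop() pass. Returns true only on the pass where a
  // new state has just been validated (i.e. it persisted for 5 ms).
  bool update(uint8_t reading, unsigned long nowMs) {
    if (!pending) {
      if (reading != stable) {          // change seen: note status and time
        candidate = reading;
        changedAt = nowMs;
        pending = true;
      }
    } else if (reading != candidate) {  // reverted: bounce or noise, cancel
      pending = false;
    } else if (nowMs - changedAt >= 5) {  // persisted 5 ms: accept it
      stable = candidate;
      pending = false;
      return true;
    }
    return false;
  }
};
```

In loop() you would call something like `encA.update(digitalRead(pinA), millis())` on every pass, with a second Debounce instance for the other encoder line.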
This means that a change is only considered valid where it persists for every pass through the loop within a 5 ms interval. If it should happen that the alternate line also changes within that 5 ms interval, its state is only validated 5 ms after it is stable so the all-important sequence of the two changes is preserved.
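Once each line has been validated that way, direction falls out of the order of the two changes. One common way to decode it (my own sketch, assuming the state is packed as A in bit 1 and B in bit 0) is a transition table over the previous and new validated states:

```cpp
#include <cstdint>

// Given the previous and new 2-bit validated state (A<<1 | B), return
// +1 for one rotation direction, -1 for the other, and 0 for no
// movement or an invalid transition (both lines appearing to change
// at once, i.e. a missed step).
int quadratureStep(uint8_t prev, uint8_t next) {
  // Gray-code sequence one way round: 00 -> 01 -> 11 -> 10 -> 00
  static const int8_t table[16] = {
     0, +1, -1,  0,
    -1,  0,  0, +1,
    +1,  0,  0, -1,
     0, -1, +1,  0
  };
  return table[(prev << 2) | next];
}
```

Feeding this only with debounced states is what makes it robust: a bounce never reaches the table, and an invalid transition decodes as 0 rather than a spurious count.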
This is the best possible way to accurately read the encoder. If the bounce time were to exceed the step time of the encoder, it would be completely impossible to decode it in any case, so there is necessarily a limit on just how fast the encoder can be read. You clearly cannot use delay() or while loops anywhere in your code; the scheme depends on loop() running without impediment.
And you could attempt to implement the debounce algorithm using interrupts and millis() (which is valid to read inside an interrupt handler, though it does not advance while the handler runs). But you would need to reliably detect every pin change in order to reject changes that did not persist for the required 5 ms - difficult if, as a result of a bounce, a change occurs during the interrupt execution itself. More to the point, once the change is stable it no longer generates interrupts, so nothing fires to tell you that the 5 ms has elapsed and the change can be accepted. You have an inherent paradox.