The documentation for attachInterrupt() states:
"Serial data received while in the function may be lost."
If this statement were referring to SoftwareSerial, I would understand.
But how does calling an interrupt service routine - while the microcontroller is performing hardware Serial, I2C and SPI communications, in both the send and receive directions - affect these communications?
I doubt these functions use millis(). So what is the reason for the "may be lost" statement? Do they rely on some pin-state-change interrupt that is disabled while a different ISR is running?
I'm still just a noob, probably below you in skill, but it sounds like if you tell it to receive data, it may take in data from I2C and then, when it gets Serial info, overwrite the I2C data it has stored. Basically, it sounds like you're telling it to take in too much at a time.
When an interrupt is triggered, the chip turns off further interrupts until the ISR completes. The Atmel chip can record one (I think) other pending interrupt while the ISR is in progress, and it will service that one when the first ISR completes.
Communication uses interrupts, and if your ISR takes too long, some of the communication interrupts may be missed.
The more I think about it, the more questions arise.
So you (@Robin2) are saying that all communication protocols are affected - but why only the receive part?
Is the TX part using a "non-interrupted timer" such as micros()?
On second thought: why should I2C and SPI be affected at all? Since the µC provides the clock, there might be a "pause" in an SPI / I2C transmission where the clock edge comes later than normal, but no harm done.
For the argument's sake, let's assume a bit length of 10 µs and an ISR duration of 11 µs.
If the communication has one transition of state while the ISR is in progress: will this change be noticed, and if so, with the correct timing or with an error of up to 11 µs?
What if two transitions occur - will these be noticed at all?