Why can't I read the state of an output port?

So this:

When doing direct port i/o if you do it at ISR level.

should read:

When doing direct port i/o if you do it at both normal and ISR level.

I know about the splitting of instructions; I had just misunderstood you as suggesting that doing it at interrupt level caused an issue on its own, rather than that doing it at both levels means one can interrupt the other.
Of course code snippets never tell the full story - they are there to serve as an example of how to do one piece of the puzzle. Generally, when I resort to access like that it is because high-speed I/O is needed (e.g. bit-banging protocols). In these situations, interrupts are always disabled before entering such a section of code because, for something like software serial, they can corrupt the timing.
Furthermore, I find it best to ensure that registers are never touched within interrupts unless they are touched only from interrupts - i.e. modify a given register from either an ISR or from normal code, but not both.
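
As a rough illustration of that kind of timing-critical section, here is a minimal sketch of my own (not from any real project), assuming an ATmega328-based board where PORTB bit 5 is digital pin 13 and the pin has already been set as an output:

```cpp
#include <Arduino.h>
#include <avr/io.h>
#include <avr/interrupt.h>

// Bit-bang a byte out on PORTB bit 5 (digital pin 13 on an Uno), LSB first.
// Interrupts are disabled for the whole transfer so an ISR cannot stretch
// the bit periods or interleave its own writes to PORTB.
void sendByteBitBang(uint8_t value)
{
  uint8_t oldSREG = SREG;        // remember the current interrupt state
  cli();                         // no interrupts during the timed section

  for (uint8_t i = 0; i < 8; i++) {
    if (value & 0x01) {
      PORTB |= _BV(PB5);         // drive the line high
    } else {
      PORTB &= ~_BV(PB5);        // drive the line low
    }
    value >>= 1;
    delayMicroseconds(100);      // one bit period (arbitrary for the example)
  }

  SREG = oldSREG;                // restore interrupts exactly as they were
}
```

Saving and restoring SREG, rather than simply calling sei() at the end, means the function still behaves correctly if it happens to be called with interrupts already disabled.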

This is really one of the reasons I dislike using (and never do use) libraries whose code I haven't studied: otherwise you never know whether these things have been missed.

Totally agree with your comments.

When performance matters, I often do the exact same sequences you have shown and then only mask interrupts in the foreground when necessary, as that gives the best performance.

The main thing I was trying to point out is that you have to be careful when you do port I/O, as atomicity can be an issue.

Another thing that some people doing raw port I/O, relying on the compiler optimizing to SBI/CBI instructions for atomicity, may not realize is that it doesn't work for all the ports.
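
To make that concrete, here is a minimal sketch of my own, assuming an ATmega2560 target, where PORTH sits outside the address range that SBI/CBI can reach, so the |= compiles to a load/modify/store sequence rather than a single atomic instruction:

```cpp
#include <avr/io.h>
#include <avr/interrupt.h>

// On an ATmega328, PORTB |= _BV(5) can compile to a single SBI instruction,
// which is atomic. On an ATmega2560, PORTH is outside the SBI/CBI range, so
// the same |= becomes a read-modify-write and needs protecting if an ISR
// also writes to PORTH.
void setPortHBit5(void)
{
  uint8_t oldSREG = SREG;   // save the global interrupt flag
  cli();                    // make the read-modify-write atomic
  PORTH |= _BV(5);          // load, OR, store - not a single SBI here
  SREG = oldSREG;           // restore the previous interrupt state
}
```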

--- bill

bperrybap:
The main thing I was trying to point out is that you have to be careful when you do port I/O, as atomicity can be an issue.

Agreed :slight_smile:

Regarding atomicity there is one thing I'd be interested to know: as the CPU is an 8-bit device, when dealing with 16-bit integers is it possible for an interrupt to be serviced in between the byte transfers? I.e. if I modify a 16-bit value within an interrupt service routine and then test it in the main program, is it possible for one byte to get changed in the middle of the test?

I should say that, to avoid this possibility, I've ensured that only the single-byte values that get modified in the interrupt are tested in the main program (essentially the interrupt just sets and clears flags). Also, in regard to the earlier caveats about port modification, I only read the port in the interrupt service routine.

An update to a 16-bit integer on an 8-bit micro is not, by its very nature, atomic, i.e. it happens in stages. So, unless interrupts are disabled, such an update could, in theory, be interrupted part way through. However, interrupts are automatically disabled during an ISR so it could not happen then.
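
For anyone reading along, the usual way round that on an AVR looks something like the sketch below (the variable name is made up for the example; noInterrupts()/interrupts() would be the plainer Arduino way, at the cost of always re-enabling interrupts afterwards):

```cpp
#include <Arduino.h>
#include <util/atomic.h>

volatile uint16_t pulseCount = 0;   // updated inside an ISR elsewhere

// Take a consistent copy of a 16-bit value that an ISR may change at any time.
// ATOMIC_RESTORESTATE disables interrupts for the two-byte read and then puts
// the global interrupt flag back exactly as it was.
uint16_t readPulseCount(void)
{
  uint16_t copy;
  ATOMIC_BLOCK(ATOMIC_RESTORESTATE)
  {
    copy = pulseCount;
  }
  return copy;
}
```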

UKHeliBob:
interrupts are automatically disabled during an ISR so it could not happen then.

Yes, I'm aware of that. Some 16-bit variables do get modified in the ISR, but I've ensured that these are not tested or written to in the main program whilst interrupts are occurring. I considered going the route of briefly disabling interrupts in the main program whilst accessing these variables, but decided it was simpler to just use flags to control things. Then, once interrupts have stopped, the 16-bit variables can be accessed safely.
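
The flag scheme as I understand it would look something like this simplified sketch (the names and pin number are made up, not the actual code from this project):

```cpp
#include <Arduino.h>

volatile uint16_t pulseTotal = 0;   // 16-bit, only written inside the ISR
volatile uint8_t  newPulse   = 0;   // one-byte flag: atomic to read or write

void sensorIsr()
{
  pulseTotal++;      // safe here: interrupts are off while the ISR runs
  newPulse = 1;      // signal the main program with a single-byte flag
}

void setup()
{
  Serial.begin(9600);
  pinMode(2, INPUT_PULLUP);                                     // sensor input
  attachInterrupt(digitalPinToInterrupt(2), sensorIsr, FALLING);
}

void loop()
{
  if (newPulse) {    // single-byte test: cannot be torn by the ISR
    newPulse = 0;
    // react to the pulse; the 16-bit pulseTotal is only read later,
    // once the motor (and therefore the interrupts) have stopped
  }
}
```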

By their very nature, interrupts can occur at any time unless they are detached/disabled. How have you ensured that your 16-bit variables will not be updated or accessed in loop() other than by detaching the interrupts?

It's quite simple. The interrupts are generated by the signal from a sensor attached to a motor. When the motor stops, so do the interrupts.

OK, understood. Personally I would program for the general case where an interrupt could occur at any time and not have special cases such as the motor being stopped.