Initializing timer registers on a 16-bit counter?

I've done some further checks with LEDs as indicators of the HIGH/LOW bytes of a 16-bit variable (unsigned int). I've also compiled it outside the Arduino IDE, and the behaviour is the same. The generated assembler code still looks OK as far as the order of reads/writes is concerned.

It boils down to this:

First I stop TIMER1 by setting the prescaler accordingly. Then I let the compiler write 0xFFFF into TCNT1 (a 16-bit register), assuming it knows how to do this correctly, and immediately read it back into a uint16_t variable called timer1. I use 8 LEDs to show the LOW/HIGH byte contents of timer1 and print the results to serial as well.

The LOW byte of timer1 is correct (0xFF) and all 8 LEDs light up. The HIGH byte, though, is ZERO. If I replace the readout of TCNT1 into timer1 with just "timer1 = 0xFFFF", the LOW/HIGH byte contents of timer1 are correct and the 8 LEDs light up twice. The serial output is correct in this case as well.

So apparently reading TCNT1 into timer1 fails. Manually reading the LOW/HIGH bytes of TCNT1 using TCNT1L and then TCNT1H fails as well.

#include <Arduino.h>            // needed for Serial when built outside the IDE
#include <util/delay.h>
#include <stdint.h>
#include <avr/io.h>
#include <avr/interrupt.h>

void
setup (void)
{
  Serial.begin (9600);

  DDRD |= ((1 << PD5));            // PD5 output
  PORTD |= ((1 << PD5));      // RED anodes HIGH. All 8 anodes go to this pin
  DDRB = 0xFF;                  // all outputs
  PORTB = 0xFF;                  // all 8 cathodes HIGH --> OFF
}

void
loop (void)
{

  unsigned char sreg;
  volatile unsigned int timer1;
  unsigned char ctr;

  sreg = SREG;                  // store IRQ flags
  cli ();                  // all IRQs off
  TCCR1B &= ~((1 << CS12) | (1 << CS11) | (1 << CS10));      // stop timer1
  TCNT1 = 0xFFFF;            // set arbitrary value, should stay the same as timer1 is stopped
  timer1 = TCNT1;            // read it back. the compiler should know how to do a 16bit read
  //timer1 = 0xFFFF;            // use this instead of reading TCNT1 and it works as expected
  SREG = sreg;                  // restore IRQ flags

  for (ctr = 0; ctr <= 7; ctr++)
    {                        // show LOW byte of timer1
      if ((timer1 >> ctr) & 0x0001)
      {
        PORTB &= ~(1 << ctr);      // turn the LED on
      }
      else
      {
        PORTB |= (1 << ctr);      // turn the LED off
      }
    }
  _delay_ms (300);
  PORTB = 0xFF;                  // all off
  _delay_ms (300);

  for (ctr = 0; ctr <= 7; ctr++)
    {                        // show HIGH byte of timer1
      if ((timer1 >> (ctr + 8)) & 0x0001)
      {
        PORTB &= ~(1 << ctr);      // turn the LED on
      }
      else
      {
        PORTB |= (1 << ctr);      // turn the LED off
      }
    }
  _delay_ms (300);
  PORTB = 0xFF;                  // all off

  Serial.print ("timer1 (should be FFFF) : ");
  Serial.println (timer1, HEX);
  Serial.print ("timer1>>8 & 0xFF (high byte, should be FF): ");
  Serial.println ((timer1 >> 8) & 0xFF, HEX);
  Serial.print ("timer1 & 0xFF (low byte should be FF): ");
  Serial.println (timer1 & 0xFF, HEX);
  Serial.println ("-");
  delay (3000);
}

What the @$& is going on?

BTW,

Happy Holidays everybody.