Can't dynamically change the WDT timeout period on an UNO board

I would like to change the WDT timeout period at the discrete levels described in the datasheet (Fig-1) after each WDT interrupt. The following sketch compiles and uploads, but the result is not as expected; the timeout seems to be stuck at the 16 ms period. I have also tried putting the ISR's code inside the loop() function, but the result is the same.

#include <avr/sleep.h>
#include <avr/wdt.h>
volatile byte timeoutPeriod = 0b00000000;    //16 ms timeout period

void setup()
{
  Serial.begin(9600);
  SMCR |= (1 << SM1);  //power down mode
  //-------------------------------
  noInterrupts(); //ensure no interrupt while initializing WDTCSR Reg
  wdt_reset();
  WDTCSR |= (1 << WDCE) | (1 << WDE); //both bits must be HIGH to start the timed sequence
  //WDCE is cleared by hardware after four clock cycles, so the new
  //value must be stored into WDTCSR within that window:
  WDTCSR = (1 << WDIE) | (0 << WDE)    
           | (bitRead(timeoutPeriod, 3) << WDP3)
           | (bitRead(timeoutPeriod, 2) << WDP2)
           | (bitRead(timeoutPeriod, 1) << WDP1)
           | (bitRead(timeoutPeriod, 0) << WDP0);  //WDT interrupt mode; no system reset

  interrupts();
  //------------------------------------
  Serial.println("Reset/Boot");
}

void loop()
{
  bitSet(SMCR, SE);  //Sleep enable
  sleep_cpu();  //asm: sleep; the MCU sleeps until the WDT time-out

  bitClear(SMCR, SE);
  Serial.println("MCU is Wake up.");
}

ISR(WDT_vect)
{
  timeoutPeriod++;
  WDTCSR |= (1 << WDCE) | (1 << WDE);//both bits must be HIGH
  WDTCSR = (1 << WDIE) | (0 << WDE) 
           | (bitRead(timeoutPeriod, 3) << WDP3)
           | (bitRead(timeoutPeriod, 2) << WDP2)
           | (bitRead(timeoutPeriod, 1) << WDP1)
           | (bitRead(timeoutPeriod, 0) << WDP0);
}


Figure-1: [datasheet excerpt] Watchdog Timer prescale-select table — WDP3:0 settings and the corresponding time-out periods (16 ms to 8 s).

You didn't clear the WDT interrupt flag…

Is there any WDT interrupt flag? What is its name, and which register contains it?

When a sketch runs with a fixed timeout period, I do not clear any flag; there is not even a need to execute the wdt_reset() command. The following sketch works well with a recurring 4-sec timeout period.

#include <avr/sleep.h>
#include <avr/wdt.h>

void setup()
{
  Serial.begin(9600);
  pinMode(13, OUTPUT);

  SMCR |= (1 << SM1);  //power down mode
  //-------------------------------
  noInterrupts(); //ensure no interrupt while initializing WDTCSR Reg
  wdt_reset();
  WDTCSR |= (1 << WDCE) | (1 << WDE); //both bits must be HIGH to start the timed sequence
  //WDCE is cleared by hardware after four clock cycles, so the new
  //value must be stored into WDTCSR within that window:
  WDTCSR = (1 << WDIE) | (0 << WDE) | bit(WDP3); //4-sec timeout; interrupt mode
          
  interrupts();
  //------------------------------------
  Serial.println("Reset/Boot");
}

void loop()
{
  bitSet(SMCR, SM1);   //Power-down mode
  bitSet(SMCR, SE);  //Sleep enable
  sleep_cpu();  //asm: sleep; the MCU sleeps for 4-sec

  bitClear(SMCR, SE);
  Serial.println("MCU is Wake up.");
  digitalWrite(13, HIGH);
  delay(100);
  digitalWrite(13, LOW);
  delay(100);
}

ISR(WDT_vect) {}  //wake-up only; hardware clears WDIF when the vector executes

Output:

11:25:42.401 -> MCU  (Reset/Boot)
11:25:46.790 -> MCU is Wake up.
11:25:51.159 -> MCU is Wake up.
11:25:55.495 -> MCU is Wake up.

WDIF in WDTCSR.

Bit 7 – WDIF: Watchdog Interrupt Flag
This bit is set when a time-out occurs in the Watchdog Timer and the Watchdog Timer is configured for interrupt. WDIF is cleared by hardware when executing the corresponding interrupt handling vector. Alternatively, WDIF is cleared by writing a logic one to the flag. When the I-bit in SREG and WDIE are set, the Watchdog Time-out Interrupt is executed.


In the sketch of post #1, the interrupt is vectored; so the WDIF flag must have been cleared automatically.

In the case of a polled interrupt, which of the following two code blocks should I execute to clear the WDIF flag?

1. 
bitSet(WDTCSR, WDIF);

2.
WDTCSR |= (1 << WDCE) | (1 << WDE);
WDTCSR = (0 << WDE) | (1 << WDIF);
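
For illustration only, a polled fragment built on the quote above ("WDIF is cleared by writing a logic one to the flag") might look like this. The assumptions here are mine: WDIE = 1 so a time-out can set WDIF, the I-bit is kept clear so no vector ever runs, and the timed WDCE sequence is not needed since WDCE only guards WDE and the prescaler bits.

noInterrupts();                     //I-bit clear: the vector never runs, so poll instead
while (!bitRead(WDTCSR, WDIF)) { }  //wait for the WDT time-out to set the flag
bitSet(WDTCSR, WDIF);               //as in block 1: write a logic one to clear WDIF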

WDIF in WDTCSR.
However, it might get cleared automatically (not the same as the Tiny25!)
OTOH, apparently if you want continued interrupts, you need to set WDIE after each trigger...

11.9.2 WDTCSR – Watchdog Timer Control Register
• Bit 7 – WDIF: Watchdog Interrupt Flag
This bit is set when a time-out occurs in the Watchdog Timer and the Watchdog Timer is configured for interrupt.
WDIF is cleared by hardware when executing the corresponding interrupt handling vector. Alternatively, WDIF is cleared by writing a logic one to the flag. When the I-bit in SREG and WDIE are set, the Watchdog Time-out Interrupt is executed.
• Bit 6 – WDIE: Watchdog Interrupt Enable
When this bit is written to one and the I-bit in the Status Register is set, the Watchdog Interrupt is enabled. If WDE is cleared in combination with this setting, the Watchdog Timer is in Interrupt Mode, and the corresponding interrupt is executed if time-out in the Watchdog Timer occurs.
If WDE is set, the Watchdog Timer is in Interrupt and System Reset Mode. The first time-out in the Watchdog Timer will set WDIF. Executing the corresponding interrupt vector will clear WDIE and WDIF automatically by hardware (the Watchdog goes to System Reset Mode). This is useful for keeping the Watchdog Timer security while using the interrupt. To stay in Interrupt and System Reset Mode, WDIE must be set after each interrupt. This should however not be done within the interrupt service routine itself, as this might compromise the safety-function of the Watchdog System Reset mode. If the interrupt is not executed before the next time-out, a System Reset will be applied.
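
As a sketch of the pattern this paragraph describes (my assumption: the watchdog was configured with WDE = 1 and WDIE = 1, i.e. Interrupt and System Reset Mode), WDIE is re-armed from loop(), not from the ISR:

void loop()
{
  sleep_cpu();           //time-out fires the interrupt; hardware clears WDIE and WDIF
  bitSet(WDTCSR, WDIE);  //re-enable Interrupt Mode outside the ISR, as the datasheet advises
  wdt_reset();           //restart the timer so the reset time-out never lands
}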


1. For some reason, the following code does not work in practice as expected to change the timeout period dynamically by incrementing the timeoutPeriod variable:

    byte timeoutPeriod = 0b00001000; //4-sec timeout
    WDTCSR |= (1 << WDCE) | (1 << WDE);//both bits must be HIGH
    WDTCSR = (1 << WDIE) | (0 << WDE)
             | (bitRead(timeoutPeriod, 3) << WDP3)
             | (bitRead(timeoutPeriod, 2) << WDP2)
             | (bitRead(timeoutPeriod, 1) << WDP1)
             | (bitRead(timeoutPeriod, 0) << WDP0);   //WDT interrupt mode; no system reset

2. But the following code works:

    WDTCSR |= (1 << WDCE) | (1 << WDE); //both bits must be HIGH
    WDTCSR = (1 << WDIE) | (0 << WDE) | (1 << WDP3);  //4-sec timeout

That whole bitRead-and-OR stuff doesn't fit in four clock cycles. Decide your next timeout period before this line and set it afterwards:
WDTCSR |= (1 << WDCE) | (1 << WDE);//both bits must be HIGH
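
For example, a reworked ISR along these lines precomputes the new register value so that only a single store lands inside the change window (untested sketch; the local variable newWDTCSR and the modulo-10 wrap are my additions):

ISR(WDT_vect)
{
  timeoutPeriod = (timeoutPeriod + 1) % 10;  //only 10 valid prescaler settings (16 ms .. 8 s)
  //compute the next WDTCSR value BEFORE opening the change window
  byte newWDTCSR = (1 << WDIE)
                   | (bitRead(timeoutPeriod, 3) << WDP3)
                   | (bitRead(timeoutPeriod, 2) << WDP2)
                   | (bitRead(timeoutPeriod, 1) << WDP1)
                   | (bitRead(timeoutPeriod, 0) << WDP0);
  WDTCSR |= (1 << WDCE) | (1 << WDE);  //open the 4-cycle change window
  WDTCSR = newWDTCSR;                  //single store, well within four cycles
}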

Recall that (some?) changes made to WDTCSR have to happen within four clock cycles of setting WDCE. Check the disassembly (e.g. run avr-objdump -d on the compiled .elf) to see how many instructions you have between them.

Also:

WDCE: Watchdog Change Enable
This bit must be set when the WDE bit is written to logic zero. Otherwise, the Watchdog will not be disabled. Once written to one, hardware will clear this bit after four clock cycles. Refer to the description of the WDE bit for a Watchdog disable procedure. This bit must also be set when changing the prescaler bits.

(The documentation is pretty awful, I think. Sigh. I still can't figure out for sure whether you need to use WDCE when enabling WDE.)
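
For reference, the datasheet's own watchdog-disable example shows the full timed sequence (lightly adapted to Arduino style here); clearing WDRF first matters because WDE is forced to one while WDRF is set:

noInterrupts();
wdt_reset();
MCUSR &= ~(1 << WDRF);               //clear the Watchdog System Reset Flag first
WDTCSR |= (1 << WDCE) | (1 << WDE);  //timed sequence: open the change window
WDTCSR = 0x00;                       //turn off the watchdog
interrupts();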

Now I know the reason why my sketch in post #1 is not capable of changing the WDT's timeout period dynamically.

Thanks to both @amazed and @westfw for extending assistance in my studies.
