Interrupt delay execution

Hi,
Can anyone please tell me whether it's possible to interrupt a delay() function that is executing and reset the machine state? Say my function flips pin states for a certain time using delay(), but when I press a button it should stop executing the delay. Is this possible?

It's possible with interrupts, but it would be better to avoid using delay() entirely.

Search this site for "non-blocking delay". That should help.


If you aren't happy with taking time to learn the correct way to handle your problem, take a look at

Read the whole thread. See if you can figure out how to use this hack technique in your circumstances.

HTH

a7


Welcome to the forum

The proper answer is not to use delay() in the first place, but rather to use non-blocking code so that other things, such as a button press, can be detected and acted upon.

See "Using millis() for timing. A beginner's guide", "Several things at the same time" and the BlinkWithoutDelay example in the IDE.


Just sayin'


("Canned" reply:)
That is not what interrupts are **for**!

As a beginner, it is incredibly unlikely that interrupts will be useful to you.

A common "newbie" misunderstanding is that an interrupt is a mechanism for altering the flow of a program - to execute an alternate function. Nothing could be further from the truth! :astonished:

An interrupt is a mechanism for performing an action which can be executed in "no time at all", with such urgency that it must be performed immediately or else data - information - will be lost or some harm will occur. It then returns to the main task without disturbing that task in any way, though the main task may well check at the appropriate point for a "flag" set by the interrupt.

Now these criteria are on a microprocessor time scale: microseconds. This must not be confused with a human time scale of tens or hundreds of milliseconds, or indeed a couple of seconds. A switch operation, as often considered, is in this latter category, and even a mechanical operation takes perhaps several milliseconds; the period of a 6000 RPM shaft rotation is ten milliseconds. Sending messages to a video terminal is clearly in no way urgent.

Unless it is a very complex procedure, you would expect loop() to cycle many times per millisecond. If it does not, there is most likely an error in code planning; while the delay() function is provided for testing purposes, its use goes strictly against effective programming methods. The loop() should successively test a number of contingencies as to whether each requires action, only one of which may be whether a particular timing criterion has expired. Unless an action must be executed within mere microseconds, it should be handled in loop().

So what sort of actions do require such immediate attention? Well, generally those which result from the computer hardware itself, such as high-speed transfer of data in UARTs (or USARTs) or disk controllers.

An alternate use of interrupts, for context switching in RTOSs, is rarely relevant to this category of microprocessors as it is more efficient to write cooperative code as described above.


So - don't use "delay()"!

:+1:

Also,
the interrupt mechanism is helpful for responding to a request that happens rarely. An ISR should contain as little code as possible so that other interrupting devices can be attended to; otherwise, interrupts could be re-enabled after entering the current ISR to implement nesting and serve higher-priority interrupts.

It's possible, I think even likely, that the OP didn't mean the "computer meaning" of the word interrupt, which is very specific in the microprocessor world. They might just have meant it in the more general English sense of the word, as in breaking into the flow and changing what's being done. In that sense, polling for a pin to change and then doing something different is an interruption, although not an interrupt :wink: It's easy to lose sight of the fact that many words that have specific meanings in, and seem to be owned by, certain disciplines actually have older, less specific meanings in plain ol' English.

There is no way to quit the execution of delay() besides a reset.
The delay will last the given period.
There are two ways to do things during the delay (one is an interrupt service routine, the other a more "Arduino-like" way using yield()), but neither is recommended for a beginner.
The clean solution is not to block the code with delay(), but to use principles like the example "blink without delay".
Using millis() as in "blink without delay" will not block your code; you can simply check for your button press and do whatever you want in parallel with your "waiting time".

You might also consider checking the examples "Debounce" and "State change detection".

See this example:

Oh, I am more than happy to learn the right way to do things, because I come from the software side, and given the heavy code we write I always want to learn ways to organize and optimize code properly. But I am not very acquainted with the embedded-systems way of thinking. I am just trying to implement a routine for the wash and dry cycles of my broken washing machine so that it helps my mom in her tasks. So you see, my thought process is different from you guys', as you are at the highest level of code optimization due to the inherent constraints of embedded devices.

Thought so, thanks for the reply.

This would be the same principle.
Use a finite state machine: switch states after a millis() interval has passed or on a button press. There is nothing mysterious about FSMs on microcontrollers if you come from other software areas.


Thanks for the explanation; I will try to use code without interrupts. I come from a software background, and polling a pin to maintain a relay's state is not the approach I would take automatically, because I already know the time required to keep the relay in a certain state. I would reach for delay() to count down the time the relay has to stay operational. But as suggested by you and others, I will now definitely try other approaches. The reason I asked the question above was to know whether interrupts can be used to halt the system and change the code flow, as this would give us a real-time response and thereby absolute control.

Will try it.

Ok, so ignore my #8, where I was wondering if you were using the word interrupt in, shall we say, its "computer sense" or just in a more general "English sense"; clearly you meant the former.

With delayless timing and not using other blocking constructs like while() you certainly can get...

... or as near as dammit anyway.

A cheap button can bounce for several milliseconds, and we have to take measures against bouncing because the microcontroller reads so fast that we see this bouncing.
So for me it makes very little sense to stress the next buzzword, "real time", while talking in the context of a human-operated switch. A button can easily be read by polling without losing acceptable reaction time, as long as the rest of the sketch doesn't block. And yes, you could even debounce the button with a reasonable "blocking" delay and your mother would not perceive it as bad reaction time. Forget the term "real time" for the current sketch, or define your "real time".

The following sketch demonstrates that the execution phase of the delay() function can be paused by an interrupt; this happens because the INT0 interrupt has higher priority than the TC0 interrupt used by the delay() function.

void setup() 
{
  pinMode(2, INPUT_PULLUP);  // button between pin 2 and GND
  pinMode(13, OUTPUT);       // onboard LED
  attachInterrupt(digitalPinToInterrupt(2), ISRZ, FALLING);
}

void loop() 
{
  digitalWrite(13, HIGH);
  delay(5000);
  digitalWrite(13, LOW);
  delay(5000);
}

void ISRZ()
{
  digitalWrite(13, LOW);
}

Good point... I was actually tempted to make my comment in #15 to have been:

On my Arduino it demonstrates only that you can read the pin and that the LED gets switched. Button presses and button holds have no measurable pausing effect on the execution of the delay:

The delay stays in its interval:

void setup() 
{
  Serial.begin(115200);
  pinMode(2, INPUT_PULLUP);
  pinMode(13, OUTPUT);
  attachInterrupt(digitalPinToInterrupt(2), ISRZ, FALLING); // don't use this to read buttons
}

void loop() 
{
  digitalWrite(13, HIGH);
  Serial.println(F("HIGH"));
  delay(5000);
  
  digitalWrite(13, LOW);
  Serial.println(F("LOW"));
  delay(5000);
}

void ISRZ()
{
  digitalWrite(13, LOW);
}

10:09:17.037 -> HIGH
10:09:22.043 -> LOW
10:09:27.038 -> HIGH
10:09:32.005 -> LOW
10:09:37.034 -> HIGH
10:09:42.021 -> LOW
10:09:47.010 -> HIGH
10:09:51.997 -> LOW

So whatever you wanted to demonstrate: the OP asked to "quit executing the delay function". If you think an ISR is good advice, just let me know and I will jump out of this thread.

Yeah, you are right; coming from an application-development background, it is easy to underestimate the speed of a microcontroller. However, I wanted to practically stop whatever the microcontroller was doing, and through my question I just wanted to know whether an ISR can put an end to the internal working of a function that was already executing and loaded in RAM (please don't ask which one; I meant RAM by analogy with a PC's RAM). Debouncing and non-blocking delays are things I am hearing of for the first time, and I intend to use them. Although my primary goal was to come up with a solution to help my mom, I also wanted to gain more understanding of embedded development. Also, coming from application development, I was concerned whether it was a good thing to repeatedly check for input when I would rather react only when and if there is one, something similar to push notifications in web applications and the MQTT protocol. This is why I started to think of using interrupts. Thanks for all the support.