Interrupt: Need to keep track of time spent in routine or ... ?

Let me start by saying I'm a beginner. That said, I've been doing lots of "Googling" and not found anything too encouraging. Maybe some folks here can give me some ideas.

Imagine that I have a program that is synchronized with external music or video. Basically, there is a video playing on a television and I want my Arduino to activate certain devices at specific times to match specific moments in the movie. However, in the event of an "accident", I want the Arduino to do something else entirely. For instance, during the movie, suppose a fire alarm goes off. I want this to be an interrupt that tells the Arduino to turn on the lights. For whatever reason, the movie can't be turned off or paused. After the fire alarm has been cleared, I want the Arduino to keep the lights on for a few more seconds and then return to its normal program. Only now it is perhaps minutes behind where the movie is, since the program was "paused" by the interrupt while the movie was still playing.

My exact application is a little different and the time spent in the interrupt is on the order of 1 second, but for synchronization, this is still an issue. I would like to find a way to keep track of how much time elapses during an interrupt.

About the only thing I could come up with was to write something that works like:

interrupt triggered -> turn power to devices off, set flag, return to program

And then, before each and every block of code that activates a device, I would include a line:

if (flag == true) {wait (time); turn power on}

Where "wait" is just a while loop that makes use of millis() and since the rest of my code is activated based on the value of millis(), this should be ok. The only loss of time is the amount of time it takes to write a pin low and set a flag. Based on my research, I can safely assume this will be less than 10 microseconds.

I'd still like a better way, though. I don't want to have to copy and paste a bunch of "if statements" into my code. For reference, my board is the Mega and, using some creative switching, I will be controlling almost 500 devices, which means I would need 500 "if statements".

Or one "if" statement repeated 500 times.

So is the "if" statement the best way to go? No simple solution for measuring the time spent in an ISR?

Part of the difficulty is that the number of devices and how they are activated changes each time the user uses the device. That makes writing a loop difficult, because I may want to turn on many devices simultaneously, or single devices one at a time, or ... etc. If I always turned on the devices one after the other, then it would be easy enough to create a loop that reads the millis() value and the device "location" from vectors and save lots of coding. I just haven't figured out how to do it without limiting its capabilities.

Also, I want to be able to control devices with a minimum delay between them, so I don't want to have to process a bunch of instructions before turning on the next device in case I want to turn a device on very quickly after the first. At most, I have 1 ms between turning devices on, which I think is plenty, but I don't know.

I'm thinking I may actually write a macro to read in a text file with the time and device information that automatically generates the needed code.

... the time spent in the interrupt is on the order of 1 second ...

You shouldn't be spending a second in an interrupt. For one thing, your millis() counter is now wrong by a second.

I don't quite get what the problem is. You want certain things to happen a fixed time after the start of the movie, right? Regardless of whether or not the fire alarm goes off? This is just straight "do X at time Y" isn't it? You certainly shouldn't need 500 "if" statements. Maybe some sort of table with 500 entries in it. Bear in mind RAM is limited. Lots of things are limited on this processor.

Can the problem be stated another way?

Have a routine that is triggered by the interrupt.

When triggered it sets a "no output" flag for a time. After the time expires, reset the flag. (I assume this is being called multiple times in loop(), so it can check the time.)

Have a common output function like:

void output_for_one_device(int device_address, boolean no_output)
{
  if (no_output)
  {
    set_output_safe();
    return;
  }
  // otherwise, carry on and do the output for device_address
}

Do you really mean interrupt? An interrupt is for a hardware signal that you have to respond to very fast. From your problem description, why not have a digital input for your interruption, and when you poll that, change your output variable? From my measurements on Arduino code, loop() repeats 20,000 times a second, so you can poll a digital input that fast. Interrupt routines are tricky.

... that was the whole point of the question. I wanted to keep track of the time spent in the interrupt so I could "fix" the millis() value.

I don't quite get what the problem is. You want certain things to happen a fixed time after the start of the movie, right? Regardless of whether or not the fire alarm goes off? This is just straight "do X at time Y" isn't it? You certainly shouldn't need 500 "if" statements. Maybe some sort of table with 500 entries in it. Bear in mind RAM is limited. Lots of things are limited on this processor.

The body of the program is "do X at time Y", but at any moment a fault could occur in the hardware. I can't give the specifics of what I'm designing, but there is a pretty realistic chance of having one of the devices (completely isolated from the Arduino) short out. This condition is detected as a voltage drop across a current sense resistor (0.01 ohms, 10 watts). Although the short is current limited by hardware, I want the Arduino to turn off the device that caused the short, wait about 100 ms for a PTC to cool (in the event that it trips - the current limiting should be taken care of via a very simple transistor/instrumentation amplifier arrangement) and then resume the program as though nothing happened (except that the shorted device was not turned on for its full duration). If the shorted device were left on, I wouldn't be able to turn on any other devices, since the voltage goes to 0 during a short.

So what's wrong with a function like what I wrote?
Is every hardware device driven by a different line of code?

tms8c8:
... that was the whole point of the question. I wanted to keep track of the time spent in the interrupt so I could "fix" the millis() value.

With interrupts disabled (as they are inside an interrupt) you can't keep track of time any more. Thus you can't "fix" the millis() value. The approach is fundamentally flawed.

From what you've described some sort of table of devices that are currently turned off would be all you require, and after X milliseconds elapse, you turn them on again and remove them from the table.

One approach that springs to mind BTW is just to use a RTC (real time clock) external chip. That will give you your "actual time" regardless of what the program is doing.
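As a rough illustration of that table (the size and field names here are just made up):

// A sketch of the "devices currently turned off" table, assuming only a
// handful of devices can be faulted at once. Names are illustrative only.
const byte MAX_OFF = 8;

struct OffEntry
{
  int devicePin;            // which device was shut off
  unsigned long turnOnAt;   // millis() value at which to turn it back on
};

OffEntry offTable[MAX_OFF];
byte offCount = 0;

void rememberOff(int pin, unsigned long delayMs)
{
  if (offCount < MAX_OFF)
  {
    offTable[offCount].devicePin = pin;
    offTable[offCount].turnOnAt = millis() + delayMs;
    offCount++;
  }
}

// call this every time through loop()
void serviceOffTable()
{
  for (byte i = 0; i < offCount; )
  {
    if ((long)(millis() - offTable[i].turnOnAt) >= 0)
    {
      digitalWrite(offTable[i].devicePin, HIGH);   // X milliseconds elapsed: turn it on again
      offTable[i] = offTable[offCount - 1];        // and remove it from the table
      offCount--;
    }
    else
    {
      i++;
    }
  }
}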

I was just reading and thinking about what you wrote, ShelleyCat :slight_smile: Didn't mean to ignore your post.

If I'm reading it correctly, you are suggesting the smart man's version of what I suggested: the interrupt sets a flag. I return to the body of the program but, with the flag, I can't output anything until a certain amount of time expires. The trouble is that many of the devices are controlled by different code.

Nick Gammon:
With interrupts disabled (as they are inside an interrupt) you can't keep track of time any more. Thus you can't "fix" the millis() value. The approach is fundamentally flawed.

That's what I was asking: is there a way to keep track of time through some sneaky manipulations or by using the watchdog timer somehow? Evidently there is not.

From what you've described some sort of table of devices that are currently turned off would be all you require, and after X milliseconds elapse, you turn them on again and remove them from the table.

One approach that springs to mind BTW is just to use a RTC (real time clock) external chip. That will give you your "actual time" regardless of what the program is doing.

In essence, yes. If a short occurs, I need the Arduino to detect it, stop the program and shut off the offending device, wait a little bit, then start turning on devices as scheduled prior to the interlude.

Thanks for the suggestion on an external clock. I'll have to look into that.

tms8c8:
That's what I was asking: is there a way to keep track of time through some sneaky manipulations or by using the watchdog timer somehow? Evidently there is not.

You can, up to a point. You could use Timer1 (the 16-bit timer) as a free-running timer. With suitable prescaling and counting modes set up, it could time quite a long period (e.g. around 4 seconds at the largest prescaler). There is a trade-off between resolution and the time before it wraps around.
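A bare sketch of that setup, assuming a 16 MHz board and the /1024 prescaler (numbers are for illustration only):

void startTimer1()
{
  TCCR1A = 0;                      // normal counting mode
  TCCR1B = _BV(CS12) | _BV(CS10);  // clock / 1024 prescaler: one count every 64 µs at 16 MHz
  TCNT1  = 0;                      // start counting from zero
}

// elapsed time since startTimer1(), in microseconds;
// the 16-bit counter wraps around after about 4.2 seconds at this prescaler
unsigned long elapsedTimer1Micros()
{
  return (unsigned long) TCNT1 * 64UL;
}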

But, I just don't like the idea. :slight_smile:

The time for an interrupt to start executing isn't exact, so your adjustment won't be precise. You are better off keeping interrupt handlers short (well under a millisecond) and doing what you want by keeping track of elapsed time.

I was wondering, why the need for an interrupt? If I understand the program, you are testing a number of devices and then controlling them (on/off) based on timing. If so, why not place digitalRead() calls in loop() and use millis() for the timing of each? Then your program becomes a simple test, then turn on/off - no interrupts needed. Unless you are doing a huge amount of processing, a digitalRead() runs in about 4 microseconds (~60 cycles), so you could do a hundred or so every millisecond, including processing overhead - plenty of time to test a lot of devices.
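Something like this bare-bones outline, where SENSE_PIN is a pin I made up and I'm assuming LOW means a short is present (the 100 ms comes from your PTC cool-down figure):

const byte SENSE_PIN = 3;               // made-up fault-sense input
const unsigned long RECOVERY_MS = 100;  // time for the PTC to cool

boolean faulted = false;
unsigned long lastFaultSeen = 0;

void setup()
{
  pinMode(SENSE_PIN, INPUT);
}

void loop()
{
  if (digitalRead(SENSE_PIN) == LOW)    // short detected this pass through loop()
  {
    faulted = true;
    lastFaultSeen = millis();
    // shut off the offending device here
  }
  else if (faulted && millis() - lastFaultSeen >= RECOVERY_MS)
  {
    faulted = false;                    // PTC has had time to cool
  }

  if (!faulted)
  {
    // normal "do X at time Y" output code, driven by millis()
  }
}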

I suspect you are doing this:

delay(time1);
output_to_device_1();
delay(time2);
output_to_device_2();

Since you have not posted any code, I am guessing. Have you looked at the blink_without_delay example? It allows several 'things' to go on at once, with the advantage that the processor is not hung in a delay() statement.

Your structure in that case could be something like this pseudocode,

// need an array of times to do things, what to do it on, which way to switch (on or off)
// a structure might be a good way to do this, you could probably get the (time, device address, operation) into
// ( a byte, two bytes, a byte), so for 500 devices would take 500 * 4 =  2K of program memory

void change_output_to_off(int current_step)
{
  // look into your sequence array, choose the device indexed by current step
  // and turn it off
}

void change_output(int current_step, boolean fault_found)
{
  if (fault_found){ return; }
  // look into your sequence array, choose the device indexed by current step
  // get the desired value,
  // do whatever address setting up you have to do for that device
  // change the value
}

boolean fault_found;
int current_step;

void setup()
{
   fault_found = false;
   current_step = 0;
   // other stuff, including recording starting time from millis()
}

void loop()
{
  if (!fault_found)
  {
     fault_found = check_fault_state();
  }
  if (fault_found)
  {
    change_output_to_off(current_step);
    if (time_elapsed_after_fault(millis()))
    {
       fault_found = false;
    }
  }
  if (time_to_do_next_step(millis()))
  {
     set_up_delay_to_step(millis(), current_step); // sets up next place to synchronise from array of devices and times
     change_output(current_step, fault_found); // this drives output if there is no fault
     current_step += 1;
  }
}
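For the sequence array itself, something along these lines might work (field sizes are just a guess, and PROGMEM keeps the table out of RAM):

#include <avr/pgmspace.h>

// One step in the sequence: when to act, which device, on or off.
struct Step
{
  uint16_t timeMs;      // time to act, relative to the start
  uint16_t deviceAddr;  // device address
  uint8_t  turnOn;      // 1 = on, 0 = off
};

// Example entries only; the real table would have up to 500.
const Step sequence[] PROGMEM =
{
  { 1000, 17, 1 },
  { 1005, 42, 1 },
  { 2500, 17, 0 },
};

const int STEP_COUNT = sizeof(sequence) / sizeof(sequence[0]);

Step readStep(int i)
{
  Step s;
  memcpy_P(&s, &sequence[i], sizeof(s));  // copy one entry out of flash
  return s;
}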

Sorry for the delay ... been busy with other things.

ShelleyCat - I don't use delay(). However, my code is somewhat more clumsy than the pseudocode you posted. Definitely got my brain jump-started on how to simplify things. Thanks!

David - My code was originally written to only loop once, so each device was coded into one loop. Rather than write the extra lines of code for each "event", I figured the interrupt would be easier. I also prefer the idea of an interrupt so that it can respond quickly to a short-circuit situation. Part of that preference comes from the fact that I'm not very familiar with the Arduino or how fast it processes. I'm just now starting to figure out how many clock cycles are needed for things like digitalWrite()/digitalRead().

Nick - What everyone is saying here is starting to sink in through my thick skull and I don't like the idea, either! :slight_smile: