Greetings.
I am using an Arduino MEGA (1280) for a video project.
On pin 2 (INT0) I have vSync, a negative-going signal which lasts for > 160 usec and repeats every 20 msec.
On pin 20 (PIND0) I have hSync, a negative-going signal which lasts for just 5.375 usec and repeats every 64 usec.
hSync is alternately high and low when vSync asserts, corresponding to even and odd fields in the video stream.
When vSync asserts, I check hSync using the fastest method I have available:-
void INT0ServiceFALL() {
  // Check hSync when vSync drops.
  // If hSync == LOW, then ODD FIELD (f1)
  // If hSync == HIGH, then EVEN FIELD (f2)
  // Note: this has not worked reliably due to delay in servicing this interrupt :-(
  // hSync is low for just 5.375 usec
  portDinput = PIND;                    // sample the whole port first, before anything else
  vSyncTimeStamp = micros();
  portDinput = portDinput & B00000001;  // isolate bit 0 (hSync)
  vSyncDetected = true;
}
where portDinput is a volatile byte defined as a global (i.e. in the header, before void setup()).
Most of the time (say 199 times out of 200) I grab hSync successfully by this method: the interrupt service routine runs within 5 usec of the trigger and all is well. But on the 200th time, the ISR takes 6.25 usec from trigger to run, and falsely reads hSync as high rather than low.
Is there any way to speed up the interrupt handling execution?
I use INT0 as it appears to be fairly high in priority. I call micros() in the ISR, which I do need, but removing that call does not reduce the incidence of slow INT0 handling.
INT0 is invoked in the usual way:-
attachInterrupt(0, INT0ServiceFALL, FALLING); //Vsync
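One idea I have been toying with (untested; a minimal sketch only, assuming an ATmega1280 with vSync on external interrupt 0 and hSync on PD0) is to bypass attachInterrupt() entirely and declare a bare ISR, which skips the indirect call through the Arduino dispatch table:

```
// Sketch only: register INT0 directly instead of via attachInterrupt(),
// saving the function-pointer dispatch overhead on each interrupt.
#include <avr/interrupt.h>

volatile byte portDinput;
volatile unsigned long vSyncTimeStamp;
volatile bool vSyncDetected;

ISR(INT0_vect) {
  portDinput = PIND & _BV(PD0);  // sample hSync immediately on entry
  vSyncTimeStamp = micros();
  vSyncDetected = true;
}

void setup() {
  EICRA |=  _BV(ISC01);  // ISC01=1, ISC00=0: trigger INT0 on falling edge
  EICRA &= ~_BV(ISC00);
  EIMSK |=  _BV(INT0);   // enable external interrupt 0
  sei();
}
```

I haven't measured whether this actually buys back enough time, and of course any other ISR already running (Timer0, for instance) would still delay entry.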
If there is no way to reduce the overhead of calling an interrupt handler, then I will probably go with an averaging algorithm which predicts what hSync ought to be, based on several successful reads, and overrides the sampled value when the fetch is bad. However, this complicates what ought to be an elegant solution.
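For what it's worth, the fallback I have in mind would look something like this (plain C++ sketch, names hypothetical): since fields strictly alternate, the predictor simply toggles the last field each vSync, and once it has locked on (several consecutive samples agreeing with the prediction), a sample that disagrees is assumed to be a late read and is overridden:

```cpp
#include <cassert>

// Hypothetical fallback: fields strictly alternate, so predict the next
// field by toggling the previous one, and override a raw hSync sample
// that disagrees with the prediction once the predictor has locked on.
struct FieldPredictor {
  bool lastField = false;        // false = even field (f2), true = odd (f1)
  int  agreeRun  = 0;            // consecutive samples matching prediction
  static const int LOCK = 4;     // samples needed before we trust prediction

  // rawOdd: true if hSync read low at vSync (i.e. odd field sampled)
  bool update(bool rawOdd) {
    bool predicted = !lastField;       // fields alternate every vSync
    if (rawOdd == predicted) {
      if (agreeRun < LOCK) ++agreeRun; // build (or keep) confidence
    } else if (agreeRun >= LOCK) {
      rawOdd = predicted;              // locked on: override the bad fetch
    } else {
      agreeRun = 0;                    // not locked yet: trust the sample
    }
    lastField = rawOdd;
    return rawOdd;
  }
};
```

After four clean alternating reads the predictor is locked, so a single late ISR that mis-reads hSync gets corrected rather than flipping the field sense.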
If any knowledgeable person has any thoughts on the subject, please feel free to let me know.
Regards, Tony Barry