strange interrupt/delay problem

Hello all, thanks for taking a look at this thread. I've been banging my head on this one for the last 2 days!

Everything used to work, and then the Arduino and a lot of other electronics blew up because I hadn't protected the circuit properly against over-voltage conditions.

I'm fixing things now, but the same code no longer works on two Arduino boards, both running 0012, which is what was on the (now blown) chip. What happens is very weird: delay() no longer delays, and the program executes parts of an if statement whose conditions are not met!

It is a relatively complex program split across four libraries and the main loop. I have an interrupt routine on the Timer2 overflow that services:

  • ADC on 4 channels
  • some calculations
  • writing data to a serial LED driver

The ISR is called every 10ms, and the above code takes about 1ms to execute.
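Roughly, the structure looks like this (heavily simplified; the timer setup values and the sensor/LED code here are only placeholders for what the real libraries do):

#include <avr/interrupt.h>

volatile int adcValue[4];               // shared with the main loop, hence volatile

ISR(TIMER2_OVF_vect)
{
  TCNT2 = 100;                          // reload so the next overflow is ~10ms away
  for (byte ch = 0; ch < 4; ch++)       // ADC on 4 channels
    adcValue[ch] = analogRead(ch);
  // ... some calculations ...
  // ... write the results to the serial LED driver ...
}

void setup()
{
  Serial.begin(9600);
  TCCR2A = 0;                                   // Timer2 in normal mode
  TCCR2B = _BV(CS22) | _BV(CS21) | _BV(CS20);   // clk/1024 prescaler
  TIMSK2 = _BV(TOIE2);                          // enable the overflow interrupt
}

void loop()
{
  // main loop works with adcValue[] ...
}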

If I insert a

Serial.println( millis() );

into the ISR,

and a

delay( 50 );

into the main loop, then the program works as expected! Removing either of these two lines causes unexpected behaviour.
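Schematically, those are the only two additions (everything else left untouched and elided here):

ISR(TIMER2_OVF_vect)
{
  // ... ADC reads, calculations, LED driver write ...
  Serial.println( millis() );   // adding this line in the ISR ...
}

void loop()
{
  // ... normal main-loop work ...
  delay( 50 );                  // ... and this one in loop() makes it behave
}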

Does anyone have a handy cluebat?

Thanks,

Matthew

It turned out to be a broken 4.3.0 compiler (avr-gcc). Upgrading to 4.3.2 sorted everything out. Thanks for looking!