I just got my Arduino Duemilanove last week and have been toying with it daily, with no major confusion until today.
I set up 4 LEDs on digital pins 4-7, a button on digital pin 3, and a pot on analog pin 0.
The pot is mapped to 0-15 and the number is shown in binary on the LEDs.
The button should trigger a mini knight-rider effect, but the sketch seems to crash at the call to delay(100); and does nothing until I reset.
Is there some rule against using a delay in an interrupt handler?
If so, why does delayMicroseconds() work? (Not ideal for me, since I need a longer delay.)
int oldPot = 0;
int val;
int potVal;
int k;

void setup() {
  DDRD = DDRD | B11110000;               // pins 4-7 as outputs
  PORTD = PORTD ^ B11110000;             // toggle pins 4-7 (LEDs are active low)
  attachInterrupt(1, krider, FALLING);   // interrupt 1 = digital pin 3
}

void loop() {
  potVal = map(analogRead(0), 0, 1023, 0, 15);
  if (oldPot != potVal) {
    PORTD = B11110000 & (~potVal << 4);  // show potVal in binary, inverted
    oldPot = potVal;
    val = potVal;
  }
}

void krider() {
  for (int i = 0; i < 4; i++) {
    k = 1 << i;
    PORTD = B11110000 & (~k << 4);
    delay(100);                          // <-- hangs here
  }
  for (int i = 3; i >= 0; i--) {
    k = 1 << i;
    PORTD = B11110000 & (~k << 4);
    delay(100);
  }
}
Any other tips and advice would be appreciated, too.
Could switch bounce be part of your problem? You trigger the interrupt, and then while you're sitting in the delay, a second edge from the button re-triggers the interrupt?
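If bounce turns out to be part of it, one common guard is to ignore edges that arrive too soon after the previous one. This is only a sketch (the 200 ms window and the lastPress/presses names are mine, and the millis() stub exists only so the snippet compiles outside the Arduino IDE); note that millis() still returns a valid, if frozen, value inside an ISR:

```cpp
#include <stdint.h>

// Stub so this compiles outside the Arduino IDE; a real sketch
// would use the core's millis() instead.
unsigned long fakeNow = 0;
unsigned long millis() { return fakeNow; }

volatile unsigned long lastPress = 0;
volatile int presses = 0;               // stand-in for the real work (the LED sweep)

void krider_isr() {                     // attachInterrupt(1, krider_isr, FALLING);
  unsigned long now = millis();         // readable in an ISR, it just won't advance
  if (now - lastPress < 200) return;    // ignore bounces within 200 ms
  lastPress = now;
  ++presses;                            // real sketch: set a flag / start the effect
}
```

The subtraction `now - lastPress` is done in unsigned arithmetic, so it keeps working when millis() eventually rolls over.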
Don't use the delay() function in an interrupt routine!
It relies on interrupts itself: delay() waits on the millis() counter, which is advanced by a timer interrupt, and interrupts are disabled while your handler runs, so the counter never moves and delay() never returns.
The delayMicroseconds() function, in contrast, uses a simple busy-wait loop and disables interrupts before execution (restoring SREG afterwards), so you can use it in interrupt routines.
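The usual fix is to keep the interrupt routine tiny, setting a volatile flag and nothing else, and to do the slow, delay()-heavy work from loop(), where delay() behaves normally. A sketch of the idea (handleLoop() stands in for the body of loop(), and the kriderRuns counter is just for illustration):

```cpp
// Keep the ISR minimal: record the event and return immediately.
volatile bool buttonPressed = false;

void krider_isr() {        // registered via attachInterrupt(1, krider_isr, FALLING)
  buttonPressed = true;    // no delay() in here
}

int kriderRuns = 0;        // stand-in: the real sketch would run the LED sweep

void krider() { ++kriderRuns; }

void handleLoop() {        // this would be the body of loop()
  if (buttonPressed) {
    buttonPressed = false;
    krider();              // delay(100) works fine from this context
  }
}
```

The flag must be volatile so the compiler re-reads it on every pass through loop() instead of caching it in a register.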
@Ray: Thanks for the tip.
And I used an interrupt because I hadn't before and thought this was a simple use for one.
I did have to modify the example you gave me to match the pins I'm using, and I had to keep the NOT in there since I wired the LEDs to come on when the pins go low.
@bohne: Ahh, that'd be why then!
I saw it said that "interrupts will work as they should", but I'm guessing that means interrupts still fire while delay() is waiting in normal code, not that delay() works inside an interrupt.
I also saw it said that more knowledgeable programmers avoid using delay() for timing anything longer than tens of milliseconds. It'd be nice to know how, just for future reference.
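For future reference, the standard way to avoid delay() is the BlinkWithoutDelay pattern: remember a timestamp and advance a little state machine whenever millis() has moved far enough. Here's a sketch of that pattern applied to the knight-rider sweep (pins 4-7, active low, matching the code above). The helper names and the millis()/PORTD stubs are mine; a real sketch would drop the stubs and call stepAnimation() from loop():

```cpp
#include <stdint.h>

// Stubs so this compiles outside the Arduino IDE; a real sketch uses
// the core's millis() and the AVR PORTD register directly.
unsigned long fakeNow = 0;
unsigned long millis() { return fakeNow; }
uint8_t PORTD = 0;

// One frame of the sweep: steps 0..7 light pins 4,5,6,7,7,6,5,4
// (active low, so the lit pin's bit is 0 and the other three are 1).
uint8_t kriderPattern(int step) {
  int i = (step < 4) ? step : 7 - step;        // i = 0,1,2,3,3,2,1,0
  return 0xF0 & (uint8_t)(~(1 << i) << 4);     // 0xF0 == B11110000
}

int step = -1;                   // -1 means the animation is idle
unsigned long lastFrame = 0;

void startKrider() {             // call this from loop() when the ISR flag is set
  step = 0;
  lastFrame = millis();
  PORTD = kriderPattern(0);
}

void stepAnimation() {           // call on every pass through loop(); never blocks
  if (step < 0) return;                        // nothing running
  if (millis() - lastFrame < 100) return;      // not time for the next frame yet
  lastFrame = millis();
  if (++step > 7) { step = -1; return; }       // sweep finished, go idle
  PORTD = kriderPattern(step);
}
```

Because stepAnimation() returns immediately instead of blocking, loop() can keep reading the pot and updating the binary display while the sweep runs.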