I'm experimenting with using a 328-based Nano 3.0 as an audio processor. I've written a pretty good chorus/flanger, but I want to experiment with more CPU-intensive stuff. As I do more in my loop, each pass takes longer to complete, and the sample rate drops below the threshold where even analogue filters can't clean up the sampling artifacts. At the moment my program uses the standard analogRead function to collect the input sound, but as I understand it a conversion takes the ADC a while (around 13 ADC clock cycles, roughly 100 µs at the default settings), and analogRead blocks until the ADC flags a register saying the result is ready. That's wasted time that could be spent processing the previous sample.
Some people have used interrupts to control/read the ADC, but as they then busy-wait in the main loop(), I have to question the wisdom of this. I'm sure there's a reason for it (some limitation of the ATmega I'm unfamiliar with, or something), but surely you'd save a lot of hassle using only loop(), in the following manner:
Take value from ADC.
Tell ADC what to do.
Process the value you took.
Output the result.
Wait until ADC is ready.
(obviously the ADC would need to be told what to do and waited for in setup(); in code, I imagine it looking something like the sketch below)
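Something like this, say (just a sketch of the idea: process() and writeOutput() are stand-ins for whatever the effect actually does, and setup() is assumed to have selected the input and reference in ADMUX, enabled the ADC, and started the first conversion):

void loop() {
  int sample = ADC;              // take the value from the ADC (reads ADCL then ADCH)
  ADCSRA |= (1 << ADSC);         // tell the ADC to start the next conversion
  int out = process(sample);     // process the value we took
  writeOutput(out);              // output the result
  while (ADCSRA & (1 << ADSC));  // wait until the ADC is ready again
}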
But this wouldn't need any interrupts, would maybe use less CPU power, and would have the advantage of doing our processing while the ADC is working. Can anyone explain this, please?
The reason for using interrupts is that you might then want to do even MORE things. Your interrupt code can do the audio processing, leaving the loop() (main) code free for "other stuff". For example (there's a rough sketch of this after the outline):
Interrupt routine:
read the A/D sample (you got an interrupt so you know a sample is available)
start the next conversion
do your audio processing on the current sample (chorus, flange, whatever)
return from interrupt
Main code (loop function):
handle button presses
write to an LCD
etc.
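In code, that might look roughly like this (a minimal sketch, not a drop-in solution: it assumes the ADC interrupt is enabled with ADIE in ADCSRA and a first conversion is started in setup(), and processSample()/writeOutput() are stand-ins for your own audio code):

#include <avr/interrupt.h>

ISR(ADC_vect) {
  int sample = ADC;                    // a sample is ready; that's why the interrupt fired
  ADCSRA |= (1 << ADSC);               // start the next conversion straight away
  writeOutput(processSample(sample));  // chorus, flange, whatever
}

void loop() {
  // handle button presses, write to an LCD, etc.
}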
Ooh, one more thing along the same lines. I looked at some examples, and one of them repeatedly incremented and decremented an unused variable between changing ADMUX and starting the conversion, with the comment "short delay before start conversion". Is this necessary? I can't find any evidence that it is.
Officially there is some funny business regarding changing the channel in the middle of a conversion (see Section 23.5 of the ATmega328P datasheet, for example), but I can't see the need for a delay if a conversion has already finished: you change ADMUX, then start a new conversion.
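i.e. something like this, with no delay at all (channel here is just a stand-in for whichever input I want next):

while (ADCSRA & (1 << ADSC));               // make sure the previous conversion has finished
ADMUX = (ADMUX & 0xF0) | (channel & 0x0F);  // change the channel (keeps REFS1:0 and ADLAR)
ADCSRA |= (1 << ADSC);                      // start the new conversion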
You can adjust the ADC's clock prescaler.
Setting it to a lower value gives faster analog reads, at the cost of some accuracy. It's normally set to 128, the highest setting; on a 16 MHz board that gives the 125 kHz ADC clock the datasheet recommends for full 10-bit resolution. Going down to a prescaler of 16 makes conversions about eight times faster with not too much inaccuracy, and 32 and 64 are options too. I think the inaccuracy won't affect your project much: it won't be noticeable in an audio signal (as opposed to, say, a potentiometer reading, which you expect to be stable while the pot isn't moving).
ADCSRA is the register with the prescaler select bits, ADPS2, ADPS1 and ADPS0:
100 - 16
101 - 32
110 - 64
111 - 128 (the default)
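For example, to drop to a prescaler of 16, you'd do something like this once in setup(), before the first analogRead:

ADCSRA &= ~((1 << ADPS2) | (1 << ADPS1) | (1 << ADPS0)); // clear the prescaler bits
ADCSRA |= (1 << ADPS2);                                   // ADPS2:0 = 100, divide by 16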