This piece of code generates a pulse of 29.41 kHz on pin 6 - far from the expected 0.5 MHz ...
0.5 MHz means a 2 µs period, or 32 ticks at 16 MIPS. Your ISR latency will take 10 to 20 ticks, and that doesn't leave much for execution. One way to check for latency is to eliminate all code other than the pin-flipping statement and see if you gain speed. If you do, latency is the issue.
majenko:
One thing I can comment on though - are you taking into account the number of instructions taken to process the interrupt?
...
That's 26 instructions from the point the interrupt routine is triggered to when the next ADC sample is started.
You mean I should write my code in such a way that fewer instructions are executed? Unless I go down to the assembly level, I don't think this is possible. But I have the feeling that there must be a way...
dc42:
Looks like you are assuming that the ADC only takes one clock to do a conversion, whereas it actually takes more (around 12, AFAIR - check the datasheet).
If this is the case, what is the prescaler all about? And what is the point of being able to sample every 1 µs (i.e. 1 MHz) or less if it takes more than that just to get the result?
majenko:
One thing I can comment on though - are you taking into account the number of instructions taken to process the interrupt?
...
That's 26 instructions from the point the interrupt routine is triggered to when the next ADC sample is started.
You mean I should write my code in such a way that fewer instructions are executed? Unless I go down to the assembly level, I don't think this is possible. But I have the feeling that there must be a way...
I don't know much about avr-gcc, but there might be attributes on the ISR routine that control what gets pushed onto the stack - those could reduce the number of registers being saved and restored, and save some time.
Also, don't forget that the output pin will be running at half the sample rate: you switch on with one sample and switch off with the next, so for a 0.5 MHz sample rate you should be getting a 0.25 MHz pin frequency.
I think I understand your confusion now. The ADC prescaler selects the ADC clock (1/16 of the system clock in your case), but each conversion takes multiple ticks (around 13) of the ADC clock to complete.
The ADC is a "successive approximation" type. Each tick of the ADC clock resolves the sampled voltage to one more bit of resolution. You can sample at a lower resolution, which requires fewer clock ticks, and thus sample faster. You still can't get a sample in just one clock tick - for that you need a "flash" type ADC, and those cost big bucks. They're used in video systems and allow giga-samples per second.
dhenry:
I think I understand your confusion now. The ADC prescaler selects the ADC clock (1/16 of the system clock in your case), but each conversion takes multiple ticks (around 13) of the ADC clock to complete.
OK, I got it now... In the document "AVR120: Characterization and Calibration of the ADC on an AVR" it is stressed that:
Since one conversion takes 13 ADC clock cycles, a maximum ADC clock of 1 MHz means approximately 77k samples per second. This limits the bandwidth in single-ended mode to 38.5 kHz, according to the Nyquist sampling theorem.
But the frequency I achieve is not 38.5 kHz; it is 29.4 kHz, which means that either my conversion takes longer or the statement PIND ^= 1<<PIND6; takes 4 cycles to execute. Does either of these hypotheses make sense?
Yes (I got 31 kHz), but that's for two samples (you flip the pin on each sample, and two flips complete one period), so sampling happens at close to 60 kHz. That's very close to the 77 kHz figure in the datasheet (without considering latency).
You mean, you tried my code and you got 31kHz? And I get 29.4kHz?! This is scandalous! I'll ask for my money back!
dhenry:
but that's for two samples (you flip the pin on each sample, and two flips complete one period), so sampling happens at close to 60 kHz. That's very close to the 77 kHz figure in the datasheet (without considering latency).
True. But it's not that close to 77 kHz, and I would like to fathom the reason for this discrepancy. If I didn't have an oscilloscope, for example, could I have predicted the actual sampling frequency with good accuracy?
Your code uses the conversion-complete interrupt to trigger a new conversion. By then you've probably missed the boat and need to wait an extra ADC clock cycle or two to start the new conversion. I think you need to put the ADC in continuous (free-running) conversion mode to achieve the quoted maximum conversion rate.
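For reference, on an ATmega328P-class AVR that switch means enabling auto-triggering with a free-running trigger source. A minimal register-level sketch (not runnable off-target; register names are from the ATmega328P datasheet, while the reference and channel choices are assumptions):

```c
/* ATmega328P register-level fragment (avr-gcc); assumes a 16 MHz system clock. */
#include <avr/io.h>

void adc_free_running_init(void) {
    ADMUX  = (1 << REFS0);                /* AVcc reference, channel ADC0 (assumed) */
    ADCSRB = 0;                           /* auto-trigger source: free-running mode */
    ADCSRA = (1 << ADEN) | (1 << ADATE)   /* enable ADC, enable auto-triggering */
           | (1 << ADIE)                  /* interrupt on conversion complete */
           | (1 << ADPS2);                /* prescaler /16 -> 1 MHz ADC clock */
    ADCSRA |= (1 << ADSC);                /* kick off the first conversion */
}
```

In free-running mode the hardware starts each new conversion by itself, so the ISR only reads ADC and flips the pin; no software start-up gap is added between conversions.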
dc42:
Your code uses the conversion-complete interrupt to trigger a new conversion. By then you've probably missed the boat and need to wait an extra ADC clock cycle or two to start the new conversion. I think you need to put the ADC in continuous (free-running) conversion mode to achieve the quoted maximum conversion rate.
Bingo! I switched to free-running mode and the frequency rose to 38.46 kHz!
Why do I get 29.4 kHz in single-conversion mode?
Update...
OK, I figured out why I get a lower frequency... The signal I measure with my oscilloscope is noisy. Perhaps some stray capacitance creates what is shown in the attached image.
After processing the data with MATLAB, I also get 31 kHz for single-conversion mode and 40.7 kHz for free-running mode.
dhenry:
Hopefully by now you have figured out why the falling edge isn't so sharp.
Yes, I just figured it out. I had connected an LED (and a resistor) between the output pin and GND. I removed the LED and now I get a nice square wave at 38.17 kHz.
Update: FYI, the highest achievable rate (with respect to the generated pulse) in free-running mode was 121.4 kHz.