
Topic: A bunch of questions about playing back sampled sounds

AWOL

. . . but it's a very simple function to take apart - most of the 110 µs is a busy wait.

Grumpy_Mike

Quote
You mean that I have a kind of "time budget" to do other things without compromising the sample rate, right?
Right.
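To put a rough number on that budget (the 8 kHz rate below is only an assumed figure for the sake of the arithmetic, not something fixed by this thread):
Code: [Select]

// Illustration only - the sample rate is assumed; the 110us figure is the one quoted above.
const unsigned long SAMPLE_RATE_HZ   = 8000UL;                            // hypothetical playback rate
const unsigned long SAMPLE_PERIOD_US = 1000000UL / SAMPLE_RATE_HZ;        // 125us between samples
const unsigned long ANALOG_READ_US   = 110UL;                             // cost of a default analogRead
const unsigned long SPARE_US         = SAMPLE_PERIOD_US - ANALOG_READ_US; // ~15us left for everything else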

Quote
What kind of noise?
Time quantisation noise, sometimes called sample noise. It is noise in the sense of signals that do not belong there.

There are ways round the default analogRead, such as setting the A/D into free running mode. But do it right and use interrupts, and let the other stuff that can wait, wait.

Lucario448

There are ways round the default analogRead, such as setting the A/D into free running mode. But do it right and use interrupts, and let the other stuff that can wait, wait.
How exactly can I do that? Do you mean that the "free running mode" will take fewer CPU cycles than the analogRead function? But how can I retrieve the readings?

And what about the map function? How long does it take?

Grumpy_Mike

Quote
Do you mean that the "free running mode" will take less CPU cycles than AnalogRead function?
No. But it will not block your program while you are waiting for the conversion to complete.
Quote
But how can I retrieve the measures?
From the analogue conversion register.
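In case it helps, a minimal sketch of free running mode with the conversion-complete interrupt might look something like this on an ATmega328 (register and bit names are from the data sheet; channel A0 and the AVcc reference are just assumptions for the example):
Code: [Select]

// Free-running A/D with an interrupt (ATmega328; assumes channel A0, AVcc reference).
volatile byte latestSample = 0;

void setup() {
  ADMUX  = _BV(REFS0) | _BV(ADLAR);               // AVcc reference, left-adjust result, channel A0
  ADCSRB = 0;                                     // trigger source = free running
  ADCSRA = _BV(ADEN) | _BV(ADATE) | _BV(ADIE)     // enable, auto trigger, interrupt on completion
         | _BV(ADPS2) | _BV(ADPS1) | _BV(ADPS0)   // prescaler 128 (the default speed)
         | _BV(ADSC);                             // kick off the first conversion
}

ISR(ADC_vect) {
  latestSample = ADCH;                            // with ADLAR set, ADCH holds the top 8 bits
}

void loop() {
  // latestSample always holds the most recent reading; nothing here ever waits for a conversion
}

The point is that loop() never blocks waiting for the A/D; the interrupt keeps latestSample topped up in the background.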

You can also make the conversion faster at the expense of a bit of precision:-
Code: [Select]

// set up fast ADC mode (Put in the setup function )
   ADCSRA = (ADCSRA & 0xf8) | 0x04; // set 16 times division
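The same trick works for other speeds; these values come from the prescaler table in the ATmega328 data sheet:
Code: [Select]

// The low three bits of ADCSRA (ADPS2:0) select the ADC clock division ratio:
//   0x07 -> divide by 128  (the Arduino default)
//   0x06 -> divide by 64
//   0x05 -> divide by 32
//   0x04 -> divide by 16   (the value used above)
//   0x03 -> divide by 8
ADCSRA = (ADCSRA & 0xf8) | 0x05; // for example, 32 times division instead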


Quote
And what about the map function? How long it takes?
No idea, but it is only a bunch of arithmetic statements for those who can't remember kindergarten maths.
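For what it is worth, the reference documents map() as a single line of long integer arithmetic, roughly (x - in_min) * (out_max - out_min) / (in_max - in_min) + out_min, so when all you want is 10 bits down to 8 a two-place shift is much cheaper:
Code: [Select]

int  raw      = analogRead(0);
byte viaMap   = map(raw, 0, 1023, 0, 255);  // a 32-bit multiply and divide
byte viaShift = raw >> 2;                   // just a two-place shift
// the two agree to within one count; the shift is far cheaper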

Lucario448

You can also make the conversion faster at the expense of a bit of precision:-
Code: [Select]

// set up fast ADC mode (Put in the setup function )
   ADCSRA = (ADCSRA & 0xf8) | 0x04; // set 16 times division

And which value should I put in it if I want just 8-bit precision, instead of the default 10 bits?

Grumpy_Mike

There is nothing that returns an 8-bit value; you just have to use the 8 most significant bits of the returned value. Shifting right by two places is the way to do it.
>> is the shift-right operator.
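A couple of worked values, in case it helps:
Code: [Select]

int  tenBit   = analogRead(0);   // 0 .. 1023
byte eightBit = tenBit >> 2;     // 0 .. 255
// e.g. 1023 >> 2 == 255,   512 >> 2 == 128,   256 >> 2 == 64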

Lucario448

you just have to use the 8 most significant bits of the returned value. Shifting right by two places is the way to do it.
So do you mean that the ADC always returns a 10-bit value?
If so, please explain a bit more about that line of code that sets the analog input pins into "fast mode", how to retrieve the data with this new configuration, and how to discard the two bits.

Grumpy_Mike

Quote
So do you mean that the ADC always returns a 10-bit value?
Yes.
Quote
If so, please explain a bit more about that line of code that sets the analog input pins into "fast mode",
The A/D works with a clock that controls the timing; that line alters the prescaler division ratio so that the clock runs faster and so the A/D runs faster.
For full information see section 23 of the ATmega328 data sheet.
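As a rough idea of what the faster clock buys (the 13-cycle conversion time is the figure in the data sheet, so treat these numbers as approximations):
Code: [Select]

// 16 MHz / 128 = 125 kHz ADC clock -> ~13 cycles -> ~104 us per conversion (default)
// 16 MHz /  16 =   1 MHz ADC clock -> ~13 cycles ->  ~13 us per conversion (fast mode)
// The data sheet recommends an ADC clock of 50-200 kHz for the full 10 bits,
// which is where the loss of precision comes from.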

Quote
how to retrieve the data with this new configuration and how to discard the two bits.
Exactly the same as before, with an analogRead function call. I told you about the shift operation to remove the lower two bits:-
Code: [Select]

eightBitValue = analogRead(0) >> 2;

Lucario448

Is there a way to measure the time that a function takes to execute (in clock "ticks")? Because I don't know exactly how long the analogRead function takes in "fast mode".

And I found a problem with discarding the two bits. For example: if the 10-bit value is 256, 512 or 768, discarding the two most significant bits will result in an 8-bit value of 0 (zero) in all three cases.

I want to clear all my doubts before putting anything into practice...

AWOL

Quote
And I found a problem with discarding the two bits. For example: if the 10-bit value is 256, 512 or 768, discarding the two most significant bits will result in an 8-bit value of 0 (zero) in all three cases.
Why on Earth would you discard the most significant bits?

Grumpy_Mike

Use the micros timer for timing function calls.

That code discards the two least significant bits not the most. Are you sure you understand what a shift operation is doing?

Lucario448

Use the micros timer for timing function calls.
Sorry for my ignorance, but how can I do that?
I mean, I want to execute a function and then print the number of ticks that the function took to the serial monitor, of course.

Quote
That code discards the two least significant bits not the most. Are you sure you understand what a shift operation is doing?
Whoops, my bad. You mean the LEAST significant bits, not the MOST significant ones. Very well, it works for me then...

Grumpy_Mike

You use micros to set a variable before you go into the routine.
https://www.arduino.cc/en/Reference/Micros

When you come out of it you subtract the value you stored from the current value to get how long it has taken.

Then you print out that number.
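Spelled out as a sketch (timing analogRead here purely as an example; it could be any call):
Code: [Select]

void setup() {
  Serial.begin(9600);
}

void loop() {
  unsigned long before = micros();
  int reading = analogRead(0);               // the call being timed
  unsigned long elapsed = micros() - before;

  Serial.print("reading ");
  Serial.print(reading);
  Serial.print(" took ");
  Serial.print(elapsed);
  Serial.println(" us");
  delay(1000);
}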

Lucario448

What does "4 microseconds resolution" mean?

Is there actually 4 microseconds between every micros count?
Or does the micros count have an accuracy of 4 microseconds (± 4 microseconds)?

Grumpy_Mike

Quote
What does "4 microseconds resolution" mean?
It means that for any given number there may be a 4 µs error, plus or minus.

