analogRead: when does it take the sample?

Hi all,
I understand that analogRead takes about 100 usec. I've been able to successfully use Jmknapp's method to get the reads much faster (closer to 20 usec). Thanks for that!

But I'm doing some fine tweaking on the program and was wondering if anybody knew off-hand whether the sample actually takes place near the beginning of the analogRead function, or closer to the end, or somewhere in the middle. I'm recording the time of the sample and wanted to know if I should record the time just before, or just after the analogRead statement to get the time closer to the actual moment the signal was sampled.

Thanks

Timestamping before would be more accurate, as analogRead "takes the sample" first, and then the function spends about 120 usec figuring out its value :slight_smile:

A little research on how an ADC works seems to be in order. An ADC charges a capacitor, using the input voltage and a resistor to control current flow, and times how long the capacitor takes to charge. The value returned by analogRead(), then, is a measure of how long it took to charge the capacitor.

So, the regular analogRead() function discharges the capacitor, measures the charge time, and returns a value. The fast analogRead function still needs to measure the charge time. So the answer to the question of when it takes the sample is "the whole while".

Not quite; you've just described how a single-slope ADC works.
Datasheet:

The ATmega328P features a 10-bit successive approximation ADC.

A successive approximation ADC takes the sample quickly and holds it, then uses a DAC to measure the voltage.

The ADC contains a Sample and Hold circuit which ensures that the input voltage to the ADC is held at a constant level during conversion

And there is a link:

The authoritative answer is in the datasheet http://www.atmel.com/dyn/products/product_docs.asp?category_id=163&family_id=607&subfamily_id=760&part_id=4198 section "Analog-to-Digital Converter". However, that leaves the question of how Arduino uses the ADC, which can be answered by looking into wiring_analog.c:

int analogRead(uint8_t pin)
{
	uint8_t low, high;

#if defined(__AVR_ATmega1280__) || defined(__AVR_ATmega2560__)
	if (pin >= 54) pin -= 54; // allow for channel or pin numbers

	// the MUX5 bit of ADCSRB selects whether we're reading from channels
	// 0 to 7 (MUX5 low) or 8 to 15 (MUX5 high).
	ADCSRB = (ADCSRB & ~(1 << MUX5)) | (((pin >> 3) & 0x01) << MUX5);
#else
	if (pin >= 14) pin -= 14; // allow for channel or pin numbers
#endif
  
	// set the analog reference (high two bits of ADMUX) and select the
	// channel (low 4 bits).  this also sets ADLAR (left-adjust result)
	// to 0 (the default).
	ADMUX = (analog_reference << 6) | (pin & 0x07);

	// without a delay, we seem to read from the wrong channel
	//delay(1);

	// start the conversion
	sbi(ADCSRA, ADSC);

	// ADSC is cleared when the conversion finishes
	while (bit_is_set(ADCSRA, ADSC));

	// we have to read ADCL first; doing so locks both ADCL
	// and ADCH until ADCH is read.  reading ADCL second would
	// cause the results of each conversion to be discarded,
	// as ADCL and ADCH would be locked when it completed.
	low = ADCL;
	high = ADCH;

	// combine the two bytes
	return (high << 8) | low;
}

Good point.
If analogRead starts the ADC with what the datasheet calls a "first conversion" (probably the case with the non-tweaked analogRead), then I should correct myself: sample-and-hold happens 54% of the way into the conversion, so the value would be a little closer to the end :~
In the other two cases, "normal" and "auto-triggered" conversions, the sample is taken near the beginning.

Thanks for the replies.

I did a little test to see if I could measure it. However, the test uses digitalWrite (to control an RC charge/discharge). How much delay is there from the time digitalWrite completes to when the signal actually goes from high to low?

Not counting that delay, the time I measure from when analogRead is called to when the sample is actually taken is about 15 to 17 usec. Using the fast ADC, that number drops to 2 to 3 usec.

Since 2 to 3 usec seems too good to be true, there must be a delay between when digitalWrite completes and when the output voltage actually changes.

I'm not sure if I can devise a test to measure the digitalWrite delay.

No need to worry: digitalWrite will affect the output pin before it returns, since setting a pin is one CPU instruction. The fast start-up of the ADC is because it works by multiplexing the relevant pin to the sample/hold circuit, then sampling and holding it, all within the first ADC clock or two. It takes at least 10 more clocks to do the successive approximation, one for each bit.

MarkT:
No need to worry: digitalWrite will affect the output pin before it returns, since setting a pin is one CPU instruction. The fast start-up of the ADC is because it works by multiplexing the relevant pin to the sample/hold circuit, then sampling and holding it, all within the first ADC clock or two. It takes at least 10 more clocks to do the successive approximation, one for each bit.

Thanks. My latest experiment agrees. I did another experiment without digitalWrite and got basically the same results, which supports the idea that digitalWrite happens very quickly and the voltage has changed by the time the digitalWrite statement completes.

Using about 4 usec to compensate for the FASTADC analogRead delay, the sample times/voltages line up with what I get externally on the oscope. (I also had a few other instructions involved, and compensated 4 usec for each of those as well.)

I haven't yet confirmed this for the standard analogRead timings.