
Topic: Reason for delay() after analogRead()?

SomeoneKnows

I'm trying to get a better idea of why things are done the way they are, specifically how the amount of delay time after calling analogRead() is determined.

I have copied some code where a value is read using analogRead(), and then the author calls delay(10); with the comment "wait 10ms for ADC to reset before next reading".
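For context, the pattern looks roughly like this (the pin assignment and variable names are just my example, not the original author's):

const int sensorPin = A0;   // Sharp GP2D12 output on analog pin 0 (example)

void setup() {
  Serial.begin(9600);
}

void loop() {
  int reading = analogRead(sensorPin);  // the conversion itself takes ~100 us
  Serial.println(reading);
  delay(10);  // "wait 10ms for ADC to reset before next reading" -- the line I'm questioning
}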

The Arduino Reference for analogRead() states that it takes about 100 microseconds (0.1 millisecond) to read an analog input. Is there a reason why I should use any larger delay than 0.1 ms?

The other reason I would expect a delay to be needed is the device supplying the signal to the pin. In my case I'm reading the Sharp GP2D12 Object Detector, which uses an IR light detector. The datasheet indicates:
- The LED pulse cycle duration is 32 ms.
- Typical response time is 39 ms.
- Typical start-up delay is 44 ms.
From these numbers it would seem the 39 ms response time is the delay to use between analogRead() calls, roughly as in the sketch below.
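If that reasoning is right, I'd expect something like this instead (again, the pin name is just my example):

const int sensorPin = A0;   // GP2D12 analog output (example pin)

void setup() {
  Serial.begin(9600);
}

void loop() {
  int raw = analogRead(sensorPin);  // ADC conversion is only ~100 us
  Serial.println(raw);
  delay(39);  // wait one sensor response period before the next reading, rather than 10 ms
}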

I'm trying to get a better understanding of how to pick out the important information from a datasheet and find accurate reasons why a delay is used, rather than blindly following someone else's code.

Vince
