Arduino Uno run time

I'm trying to measure how much time the Arduino takes to read an analog input or execute a line of code.

Here is my code:

void setup() {
  // put your setup code here, to run once:
  Serial.begin(9600);
  val = analogRead(pin);
  int val2 = val * 5 / 1024;
  int val3 = val2 - 2.5 * (311);
  int val4 = val4 + val3 * val3;
  Serial.println(micros());
}

Serial.println() always shows 216 microseconds.
If I only execute the analogRead() call, it also shows 216 microseconds.
If I run only these lines of code:

int val2 = val*5/1024;
int val3 = val2-2.5*(311);
int val4 = val4 + val3*val3;

it only takes 8 microseconds to execute. This doesn't add up. I suspect it is affected by the baud rate. Can someone tell me how much time the Arduino takes to execute a single line of code, and how long the analogRead() call takes?

Store off the current micros() before the line, execute the line, then subtract the stored value from the current micros(). This gives you the delta in microseconds, which should be (roughly) the time taken to execute that line. :)

Unused code and variables will be removed by the optimizer.

It seems to do a thorough, recursive job of it.

(val4 would contain garbage anyway; it is not initialized properly.)
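If you want to time those arithmetic lines on their own, one way to keep the optimizer from throwing them away (a minimal sketch of the idea, not code from this thread; A0 is just an assumed pin) is to make the results volatile, so the compiler has to keep the work:

void setup() {
  Serial.begin(9600);
  volatile int val = analogRead(A0);    // volatile: the compiler may not drop this read
  unsigned long t1 = micros();
  volatile int val2 = val * 5 / 1024;   // volatile results cannot be dead-stripped
  volatile int val3 = val2 - 2.5 * 311;
  volatile int val4 = val3 * val3;      // note: initialized from val3, not from itself
  unsigned long t2 = micros();
  Serial.println(t2 - t1);              // elapsed time for the three lines above
}

void loop() {}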

I tried this code:

void setup() {
  Serial.begin(9600);
  t1 = micros();
  val = analogRead(pin);
  t2 = micros();
  Serial.print(t2 - t1);
}

I'm getting 208 microseconds. What is the math behind this? I'm still confused.

Which part are you confused about?

If it's how this works, then:

micros() returns the number of microseconds elapsed since the program started. You take two sample points; by subtracting one from the other you are left with the time taken to execute the code between those points.

If it's about why your micros don't seem to add up, then it could be dead-stripping, but most likely micros() aliasing. Depending on your board it will return multiples of either 4 or 8 microseconds; whether this is rounded at all, I'm not sure (I'd guess it's truncated, so always rounded down).
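A quick way to see that granularity for yourself (just a throwaway sketch, not code from this thread): capture a few back-to-back readings first and print them afterwards, so the slow Serial output doesn't get in between the samples.

void setup() {
  Serial.begin(9600);
  unsigned long samples[10];
  for (int i = 0; i < 10; i++) {
    samples[i] = micros();        // grab the readings back to back
  }
  for (int i = 0; i < 10; i++) {
    Serial.println(samples[i]);   // on a 16 MHz Uno every value is a multiple of 4 µs
  }
}

void loop() {}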

First, is my code right for measuring how much time it takes to read an analog value?
Second, if yes, then why does it show 208 and not 100 microseconds? What I have read here https://www.arduino.cc/en/Reference/AnalogRead says that "It takes about 100 microseconds (0.0001 s) to read an analog input, so the maximum reading rate is about 10,000 times a second."

Looks fine to me. You'd have to look at the internals of the micros() and analogRead() functions to check for any overheads. A quick, easy test that might shed some light would be to run a tight loop of analogReads, say 1000 times, and time that entire chunk. Divide the result by the number of iterations (1000) to get the time per call.

That should mitigate any overheads from the micros call.
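Something like this, for example (a sketch of that suggestion; A0 is just an assumed pin):

void setup() {
  Serial.begin(9600);
  const int N = 1000;                  // number of reads to average over
  unsigned long t1 = micros();
  for (int i = 0; i < N; i++) {
    analogRead(A0);                    // result ignored, we only care about the timing
  }
  unsigned long t2 = micros();
  Serial.print("average per analogRead: ");
  Serial.print((t2 - t1) / (float)N);
  Serial.println(" us");
}

void loop() {}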

  1. ADC - Analog to Digital Converter
    28.1. Features
    • 10-bit Resolution
    • 0.5 LSB Integral Non-Linearity
    • ±2 LSB Absolute Accuracy
    • 13 - 260 µs Conversion Time
    • Up to 76.9 kSPS (Up to 15 kSPS at Maximum Resolution)

Page 305 of the ATmega328 datasheet.

So your 208 µs seems to be reasonable.
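For a rough feel for the numbers (assuming the Arduino core's stock ADC prescaler of 128, which is not something stated in this thread):

16 MHz / 128 = 125 kHz ADC clock
13 ADC clocks per normal conversion ≈ 104 µs
25 ADC clocks for the first conversion after the ADC is enabled = 200 µs

So a first analogRead() in setup(), plus a little call overhead and the 4 µs micros() granularity, landing near 208 µs is plausible.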

@Whandall: So it only depends on the conversion time?
Is it connected to the Arduino Uno's clock speed of 16 MHz?

Is it critical?

These guys managed to reduce the time taken (at a cost in resolution) by changing the ADC prescaler:

link

The duration of analogRead() obviously depends on the time it takes to make a conversion.

analogRead() selects the input to be sampled, starts the conversion, waits for it to finish,
and then returns the converted value.

How to use other modes, and how the conversion time is connected to the prescaler, resolution, etc.,
you should look up in the datasheet.
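As an illustration of the prescaler point (only a sketch based on the ATmega328P's ADCSRA register bits, not code from this thread), you can trade accuracy for conversion speed like this:

void setup() {
  Serial.begin(9600);

  // Stock prescaler is 128: 16 MHz / 128 = 125 kHz ADC clock, ~104 µs per conversion.
  // Set the prescaler to 16 instead: 16 MHz / 16 = 1 MHz ADC clock, ~13 µs per conversion,
  // at the cost of some accuracy.
  ADCSRA = (ADCSRA & ~(bit(ADPS2) | bit(ADPS1) | bit(ADPS0))) | bit(ADPS2);

  unsigned long t1 = micros();
  int val = analogRead(A0);       // A0 is just an example pin
  unsigned long t2 = micros();

  Serial.print(val);
  Serial.print(" read in ");
  Serial.print(t2 - t1);
  Serial.println(" us");
}

void loop() {}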