Hi,
I'm new to this forum, but I have been working with Arduinos for quite a while.
I have a question about the execution time of floating-point operations. I found this old thread:
http://arduino.cc/forum/index.php/topic,40901.0.html mentioning execution speeds, and I was wondering where those numbers were coming from...
I get different (very strange and confusing) results...
First I used the function micros() to time my operations, but I read that micros() only has a resolution of 4 µs?
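Roughly what that first attempt looked like (a sketch from memory, inside loop()):

  float fnumber = 50.0;
  float fresult;
  uint32_t start = micros();
  fresult = sqrt(fnumber);           // operation under test
  uint32_t elapsed = micros() - start;
  Serial.println(elapsed, DEC);      // this reported a delta of 4 (µs)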
For my second attempt I'm using Timer1 with prescaler 1 on a Duemilanove, so at 16 MHz I should get 16 ticks per µs.
This is the code I'm trying to time:
void setup()
{
  Serial.begin(9600);

  // Timer1: clear the three clock-select bits, then set prescaler to 1,
  // so the timer counts at the full 16 MHz (16 ticks per µs)
  TCCR1B &= 0xF8;
  TCCR1B |= (1 << CS10);
}

void loop()
{
  float fnumber;
  float fresult = 0.0;
  uint16_t time;

  TCNT1 = 0;                 // reset the counter
  fnumber = 50.0;
  fresult = sqrt(fnumber);   // operation under test
  //fresult = sin(fnumber);
  //delayMicroseconds(10);   // sanity check for the timer
  time = TCNT1;              // read elapsed ticks

  Serial.print("delay: ");
  Serial.println(time, DEC);
  Serial.print("sqrt(");
  // for floats the second argument of print() is the number of decimal
  // places, not the base; DEC (== 10) would print 10 decimals
  Serial.print(fnumber, 6);
  Serial.print(") = ");
  Serial.println(fresult, 6);

  delay(1000);
}
Executing a sqrt() or sin() gives me a delta of 1, meaning 1/16th of a µs. That can't be right, but I can't figure out what I'm doing wrong...
Inserting a delayMicroseconds(10) call gives me (roughly) the correct delta of 156 ticks (10 µs × 16 ticks/µs = 160 expected), so my timer seems to work correctly.
Using the function micros() instead of Timer1 gives me a delta of 4 µs...
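The only explanation I can come up with is that the compiler might be pre-computing sqrt(50.0) at compile time, since fnumber is effectively a constant. If that's what's happening, something like this should force the calculation to run between the two timer reads (just a guess on my part, I haven't confirmed it):

  volatile float fnumber = 50.0;  // volatile: the compiler can't assume the
                                  // value, so sqrt() has to execute at runtime
  TCNT1 = 0;
  float fresult = sqrt(fnumber);
  uint16_t time = TCNT1;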
Is an Arduino really that fast at executing floating-point calculations?