In my code, I am using TCNT3 to time a function. Before the function gets called,
I set it to zero (TCNT3 = 0;), and at the end of the function I read it back:
timer_diff = TCNT3;
In this case the value read from TCNT3 is 28355, so timer_diff = 28355.
My CPU runs at 20MHz and the Timer3 has a prescaler of 1024. In order to figure out the time it took the function to execute in seconds, I did the following:
1024 / 20,000,000 = 0.0000512 seconds, i.e. 51.2 µs per timer tick. I then use this number to calculate
time = 51.2 µs * 28355 = 1,451,776 µs, and dividing by 1000 gives 1451.776 milliseconds,
which is about 1.45 seconds...
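For context, a Timer3 setup matching that description (normal mode, /1024 prescaler on an ATmega1284P-class part) might look roughly like the sketch below; the function name and exact configuration are assumptions, since the post doesn't show them.

#include <avr/io.h>

// Assumed setup: Timer3 free-running in normal mode, clocked at F_CPU / 1024,
// i.e. one tick every 1024 / 20 MHz = 51.2 us.
static void timer3_init(void)
{
    TCCR3A = 0;                          // normal mode, no output compare pins
    TCCR3B = (1 << CS32) | (1 << CS30);  // clock select bits 101 -> clk_io / 1024
    TCNT3  = 0;                          // start counting from zero
}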
So if one function call takes about 1.45 seconds, then keeping a counter that adds up the time of each call should yield the total time. If the function is called, say, 100 times, the total should be 100 * 1.45 s ≈ 145 seconds.
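A rough sketch of that accumulation (the loop and the name total_ticks are illustrative, not the actual code): summing the raw tick counts and converting to milliseconds once at the end keeps the arithmetic out of 16 bits.

uint32_t total_ticks = 0;                // accumulator for raw Timer3 ticks (made-up name)
for (uint8_t i = 0; i < 100; i++) {
    TCNT3 = 0;                           // reset the count before each call
    function();                          // the function being timed (placeholder)
    total_ticks += TCNT3;                // add this call's tick count
}
// one tick = 1024 / 20 MHz = 51.2 us, so ticks * 1024 / 20 = microseconds
uint32_t total_ms = (uint32_t)((uint64_t)total_ticks * 1024 / 20 / 1000);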
But when I time it with my iPhone's stopwatch, or just calculate it on a calculator, and compare that with the serial
output, I get a deviation of 1000 milliseconds, but it should be 1450 considering the above, right?
Any ideas about this whole concept are very much appreciated.
He's obviously using a classic AVR with one of the usual 16-bit timers (like Timer1 on almost all classic AVRs), probably a Mega, but possibly a 328PB or 1284P/similar.
That is a sound way to measure the time a function call takes, yes. That's how I usually do it (though I don't write functions that are that slow; what the hell are you doing?!). When I'm timing functions with a 16-bit timer, I'm prescaling by 1 or 2, which is a good thing, because I'm usually doing it on a megaAVR's type B timers, which don't have the lovely independent prescaler that the classic AVR 16-bit timers have.
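For illustration, timing at a prescaler of 1 on a classic AVR 16-bit timer might look roughly like this sketch; register names are for Timer1 and short_function() is a placeholder.

TCCR1A = 0;                // Timer1 in normal mode
TCCR1B = (1 << CS10);      // no prescaling: one tick per CPU clock (50 ns at 20 MHz)
TCNT1  = 0;
short_function();          // the code being timed (placeholder name)
uint16_t cycles = TCNT1;   // elapsed CPU cycles; the count wraps after 65535 cycles (~3.3 ms at 20 MHz)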
It's not clear to me whether you're reporting a difference of 1 second between the measured time and what you calculate from the timer (if so, that's within the expected error of a human pressing a stopwatch button, particularly on a phone touchscreen), or whether you're getting something radically different. Show the numbers involved.
It is the ATmega1284P. TCNT3 is the counter register of Timer3, which is a 16-bit timer.
What bothers me in my SERCOM output is the following:
Say I want to call a function x() 100 times. I enter the number and then I get 100 invocations of that function x(). That is fine; as I posted above, each call to x() takes about 1.45 seconds, so 100 calls should take 100 * 1.45 s ≈ 145 seconds, i.e. 145,000 milliseconds. In fact, in my output I see a different value, which confuses me.
TCNT3 = 0;                      // reset the Timer3 count before the call
t_0 = 0;
function();                     // does its work
t_1 = TCNT3;                    // saving the actual value of TCNT3 in a uint16_t variable
t_delta = t_1 - t_0;            // calculating the difference in timer ticks
time = (t_delta * 51) / 1000;   // we get the time in milliseconds here, which is then printed in the SERCOM output
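One side note, as a sketch rather than a diagnosis of the actual code: on AVR, int is 16 bits, so if t_delta and time are 16-bit variables, the intermediate product t_delta * 51 overflows for tick counts like 28355 (28355 * 51 = 1,446,105). A version that forces the conversion into 32-bit arithmetic and keeps the exact 51.2 µs tick could look like this:

uint32_t time_us = (uint32_t)t_delta * 1024UL / 20UL;  // ticks * 51.2 -> microseconds (28355 -> 1,451,776)
uint32_t time_ms = time_us / 1000UL;                   // -> 1451 ms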