
Topic: time measure

erich

hello,
there is no way to measure time more accurately than in milliseconds.

Of course one can make a loop with a counter. If we knew how many clock cycles one pass of this loop needs, we would know the time.

Is there a way to figure this out?

best

erich
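
For illustration, a rough sketch of the counting idea erich describes. The pin number and the cycles-per-pass figure are placeholders, not measured values; the real cost of one pass depends on what the compiler generates.

const int inputPin = 7;              // hypothetical input to watch
const float CYCLES_PER_PASS = 40.0;  // ASSUMPTION: would have to be checked against the compiled code
const float CYCLES_PER_US = 16.0;    // 16 MHz clock = 16 cycles per microsecond

void setup() {
  pinMode(inputPin, INPUT);
  Serial.begin(9600);
}

void loop() {
  unsigned long count = 0;
  while (digitalRead(inputPin) == HIGH) {
    count++;                         // one pass costs roughly CYCLES_PER_PASS cycles
  }
  if (count > 0) {
    Serial.println(count * CYCLES_PER_PASS / CYCLES_PER_US);  // estimated pulse length in microseconds
  }
}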
  

David Cuartielles

Hej Erich,

Have you tried out "delayMicroseconds()"?

/David
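
A minimal example of what delayMicroseconds() does, with an arbitrary pin number: it simply blocks for the given number of microseconds, so a short pulse can be generated like this.

const int pulsePin = 8;    // arbitrary pin, just for illustration

void setup() {
  pinMode(pulsePin, OUTPUT);
}

void loop() {
  digitalWrite(pulsePin, HIGH);
  delayMicroseconds(10);   // hold the pin high for roughly 10 microseconds
  digitalWrite(pulsePin, LOW);
  delay(100);              // wait 100 ms before the next pulse
}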

erich

I know what you mean: count a certain number of delays. But I guess I also have to take the cycles of the "while" loop (the test of the condition) into account,
so there will always be more delay than the nominal delay time alone, right?

erich
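
That concern, as a sketch: every pass of the counting loop costs the nominal delay plus the time taken by digitalRead() and the loop test, so the total is always longer than count times the delay. The pin number and the overhead figure below are assumptions; the overhead has to be measured, not guessed.

const int inputPin = 7;              // hypothetical pin to watch
const unsigned int DELAY_US = 10;    // nominal delay per pass
const float OVERHEAD_US = 5.0;       // ASSUMPTION: per-pass overhead, needs calibration

void setup() {
  pinMode(inputPin, INPUT);
  Serial.begin(9600);
}

void loop() {
  unsigned long count = 0;
  while (digitalRead(inputPin) == HIGH) {
    delayMicroseconds(DELAY_US);
    count++;
  }
  if (count > 0) {
    Serial.println(count * (DELAY_US + OVERHEAD_US));  // corrected estimate in microseconds
  }
}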

erich

To be more precise about why I'm asking this:
I wrote some code for the SRF04 ultrasonic sensor.
I have to measure the length of the echo pulse, which is proportional to the distance of the obstacle.
Of course, if I use a counter (like now) I also get a value that is related to the distance, but if I had the actual duration of the pulse I could easily calculate the distance in e.g. cm without doing empirical calibration with the sensor.
erich

PS: I will post the code when it's clean.
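
Not erich's code, but a rough sketch of the SRF04 approach he describes, assuming the trigger on pin 8 and the echo on pin 7 (pin numbers are arbitrary): trigger the sensor with a ~10 us pulse, count while the echo line stays high, and convert the pulse length to cm using the sensor's roughly 58 us per cm (round trip).

const int trigPin = 8;
const int echoPin = 7;

void setup() {
  pinMode(trigPin, OUTPUT);
  pinMode(echoPin, INPUT);
  Serial.begin(9600);
}

void loop() {
  // send the trigger pulse
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);

  // wait for the echo line to go high (no timeout, just for illustration)
  while (digitalRead(echoPin) == LOW) { }

  // count while the echo line stays high; each pass is padded to about 10 us,
  // plus some loop overhead that would need calibration
  unsigned long count = 0;
  while (digitalRead(echoPin) == HIGH) {
    delayMicroseconds(10);
    count++;
  }

  // if one pass really took 10 us, count * 10 is the pulse length in us,
  // and dividing by 58 gives the distance in cm
  Serial.println((count * 10UL) / 58UL);

  delay(250);
}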

phred

I have the same problem as Erich. I'm using a couple of "ping" ultrasonic sensors for a project. Unfortunately, sound travels at approx. 340 m/sec, or 34 cm/millisec. Since the echo covers the round trip, if I can only read time accurately in milliseconds, the usable resolution of the sensors drops to 17 cm.

I suppose I could add a quartz crystal to the circuit and use that for timing, but if Arduino is able to set a delay in microseconds, why can't it read time in microseconds?
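
The arithmetic behind those numbers, as a small sketch: sound covers about 0.034 cm per microsecond, and the echo travels out and back, so each unit of measured time corresponds to half that distance.

const float SOUND_CM_PER_US = 0.034;   // approx. 340 m/s expressed in cm per microsecond

float echoTimeToCm(float echoTimeUs) {
  return echoTimeUs * SOUND_CM_PER_US / 2.0;   // divide by 2 for the round trip
}

void setup() {
  Serial.begin(9600);
  Serial.println(echoTimeToCm(1000.0));  // 1 ms of echo time  -> about 17 cm
  Serial.println(echoTimeToCm(58.0));    // 58 us of echo time -> about 1 cm
}

void loop() { }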



David Cuartielles

Hej,

I guess that Dave Mellis will fill in more details in a while, but the way we count time in microseconds is a tricky one (it is an assembler loop that ensures the accuracy needed for things like DMX, serial, etc.).

At this point I don't really know how much we can touch the libraries to do exactly what you want, but you can always use the trick of measuring how long it takes to perform an operation e.g. 1000 times, and then use that figure as a reference when counting.

The issues regarding time resolution have to do with:

a) the clock is running at "only" 16 MHz (much faster than many commercial stamps), so we count in tens of nanoseconds (one clock cycle is 62.5 ns)

b) the open-source compiler does not translate code 100% predictably: e.g. one addition written in C takes more than one clock cycle, even though the processor has a RISC architecture

Dave will probably add more on this later.

/David
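
A sketch of that calibration trick, under the same assumptions as the earlier examples (hypothetical pin 7, and a repeat count chosen large enough that the millis() resolution does not dominate): time many repetitions of the operation, derive the microseconds per pass, then use that figure when counting. The calibration loop is not identical to the measuring loop, so the result is only approximate.

const int echoPin = 7;                 // hypothetical input pin
const unsigned long REPS = 100000UL;   // number of repetitions to time

float usPerPass = 0;                   // filled in by calibrate()

void calibrate() {
  volatile unsigned long dummy = 0;    // volatile so the loop is not optimised away
  unsigned long start = millis();
  for (unsigned long i = 0; i < REPS; i++) {
    dummy += digitalRead(echoPin);     // the operation the measuring loop will repeat
  }
  unsigned long elapsedMs = millis() - start;
  usPerPass = (elapsedMs * 1000.0) / REPS;
}

void setup() {
  pinMode(echoPin, INPUT);
  Serial.begin(9600);
  calibrate();
  Serial.println(usPerPass);           // measured microseconds per loop pass
}

void loop() {
  unsigned long count = 0;
  while (digitalRead(echoPin) == HIGH) {
    count++;
  }
  if (count > 0) {
    Serial.println(count * usPerPass); // approximate elapsed time in microseconds
  }
}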
