Delay in nanoseconds

Hello everyone:
I have a problem: I need to generate a delay of 12.5 microseconds, but I have not found a way to do it.
The delayMicroseconds() function only accepts integer values, so the closest I can get is a delay of 12 microseconds, when what I need is 12.5 microseconds, i.e. 12,500 nanoseconds.
Does anyone know how I can do this??
greetings and thanks! :slight_smile:

You could write 200 NOPs @ 62.5ns each, assuming a 16MHz clock (minus overheads).
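For what it's worth, you don't have to paste in 200 literal nops: recent avr-gcc has __builtin_avr_delay_cycles(), which emits an exact cycle-count busy-wait for you. A minimal sketch of the idea (PB0 is just a placeholder pin; at 16MHz, 200 cycles x 62.5ns = 12.5us):

#include <avr/io.h>

void pulse12u5(void)
{
    PORTB |=  _BV(PB0);               // direct port write; digitalWrite() alone takes a few us
    __builtin_avr_delay_cycles(200);  // exactly 200 cycles = 12.5us at 16MHz
    PORTB &= ~_BV(PB0);
}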

Problem is that small delays like that are not well suited to being written in C. Also, with interrupts going off at random times, a delay that small is likely to be much longer sometimes. You can always write a small tight loop with an increment inside and use an oscilloscope to measure how many times you need to go through it to get your delay, but it all depends on your exact application.
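If you'd rather not tune the loop entirely by hand, avr-libc's _delay_loop_2() in util/delay_basic.h is a ready-made version of that tight loop: it burns exactly 4 CPU cycles per iteration, and the call/return overhead is what the oscilloscope check would trim out. A minimal sketch, assuming a 16MHz clock:

#include <util/delay_basic.h>

void wait12u5(void)
{
    _delay_loop_2(50);   // 50 iterations x 4 cycles x 62.5ns = 12.5us
}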

thank you very much ;D, I was able to adjust the output timing using fastWrite and asm("nop\n\t").
But now another problem arises: how can I control the timing of a signal on a digital pin?
Because the micros() function doesn't let me use decimals.
thanks for your answers.

Because the micros() function doesn't let me use decimals.

Do you mean fractions?

It's outside the Arduino environment, but the way I would handle this is to set up an interrupt to occur 12.5us in the future using the TimerN Output Compare registers.

Pseudo-code…

when (event happens)
{
    store current timer value
    set output compare register at current_time + 12.5us
    enable interrupt
}

output_compare_interrupt()
{
    do something
    disable interrupt
}
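On an ATmega328 at 16MHz, that pseudo-code might come out something like the sketch below (untested, and it takes over Timer1, so PWM on pins 9/10 is lost; with a prescaler of 1, one timer tick = 62.5ns, so 12.5us = 200 ticks):

#include <avr/io.h>
#include <avr/interrupt.h>

void timerSetup(void)
{
    TCCR1A = 0;
    TCCR1B = _BV(CS10);           // free-running, prescaler 1
}

void onEvent(void)                // call this when the event happens
{
    OCR1A   = TCNT1 + 200;        // compare match 12.5us from now
    TIFR1   = _BV(OCF1A);         // clear any stale compare flag
    TIMSK1 |= _BV(OCIE1A);        // enable the compare interrupt
}

ISR(TIMER1_COMPA_vect)
{
    // do something
    TIMSK1 &= ~_BV(OCIE1A);       // disable the interrupt again
}

Bear in mind the compare match itself is exact, but ISR entry adds a few cycles of latency before "do something" runs.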

hello all, thanks for the responses.
I need to measure the duration of an event, from when it begins to when it ends.
The event can vary from 12.5 microseconds to 14.7 microseconds,
and the measurement of that length of time has to be accurate,
with a resolution of 0.1 microsecond.
thanks and greetings

I need to measure the duration of an event, from when it begins to when it ends.
The event can vary from 12.5 microseconds to 14.7 microseconds.

That's nothing to do with the problem you posed in the original post.

the measurement of that length of time has to be accurate

Define "accurate".

Indeed the problem has changed considerably from the original post.

With a resolution of 0.1 microsecond.

Can't do that on an Arduino: it only has a 16MHz clock, so there isn't time to sense an input, compare it to the level you want, and stop a clock. An accuracy of 0.1us corresponds to a 10MHz clock, so to time to that accuracy you need at least a 50MHz clock on your processor. Time to redefine what you want or move on to another processor.

thank you very much for the reply.
I will think about how to measure the duration of the pulse signal.
Sorry for drifting away from the original topic.

Is this pulse repeatable, with the same timing between each pulse?

I'm thinking you could use a trick that's used to time functions with very short execution times.

In that case, you record the start and end times of the function hundreds of times, and then average the results.

The way this works: if the function were to take 0.5 milliseconds, for example, and your timer only had a resolution of 1 millisecond, then half the time you checked how long the function took you'd get 0 milliseconds (because the timer didn't have a chance to tick over), and half the time you'd get 1 millisecond. Averaging those out then gives you the real timing.

So if you're trying to read the speed of a motor or something, using a photodetector and reading those pulses, and you can read a whole lot of them, then perhaps there's a way you can average your results to make it work.
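A minimal sketch of that averaging idea, assuming the pulse repeats and its phase relative to the timer is effectively random (the pin and sample count are arbitrary):

const uint8_t  PULSE_PIN = 2;
const uint16_t SAMPLES   = 500;

void setup()
{
    Serial.begin(9600);
    pinMode(PULSE_PIN, INPUT);

    unsigned long total = 0;
    for (uint16_t i = 0; i < SAMPLES; i++)
        total += pulseIn(PULSE_PIN, HIGH);   // integer microseconds per reading

    // averaging recovers the fraction that each integer reading rounds away
    Serial.print("average width (us): ");
    Serial.println((float)total / SAMPLES, 2);
}

void loop() { }

Note this only sharpens the average over many pulses; it can't give you 0.1us on a single one-off event.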

I will think about how to measure the duration of the pulse signal.

If you read up on the ATmega hardware timers, you will find they can be configured to run with a resolution of 1 CPU cycle (62.5 nanoseconds on a 16MHz Arduino). This should allow you to measure the time between two pulse edges accurate to within +/- 1 CPU cycle.

For the range you ask for (2.2 microseconds) this would give you about 35 steps (assuming a 16MHz CPU clock).

An application like this however is way outside the comfort zone of standard Arduino.
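For the record, here is a hedged sketch of that approach on an ATmega328, using Timer1's Input Capture unit (ICP1 = digital pin 8) so the edge timestamp is latched in hardware rather than in software:

#include <avr/io.h>
#include <avr/interrupt.h>

volatile uint16_t riseTicks;
volatile uint16_t widthTicks;
volatile bool     done = false;

void setup()
{
    Serial.begin(9600);
    TCCR1A = 0;
    TCCR1B = _BV(ICES1) | _BV(CS10);   // capture rising edge, prescaler 1
    TIFR1  = _BV(ICF1);                // clear any stale capture flag
    TIMSK1 = _BV(ICIE1);               // enable the capture interrupt
}

ISR(TIMER1_CAPT_vect)
{
    if (TCCR1B & _BV(ICES1)) {         // rising edge: pulse starts
        riseTicks = ICR1;
        TCCR1B &= ~_BV(ICES1);         // now wait for the falling edge
    } else {                           // falling edge: pulse ends
        widthTicks = ICR1 - riseTicks; // unsigned wrap-around is safe
        TCCR1B |= _BV(ICES1);
        done = true;
    }
    TIFR1 = _BV(ICF1);                 // switching edges can set a spurious flag
}

void loop()
{
    if (done) {
        noInterrupts();                // don't let the ISR tear the 16-bit read
        uint16_t w = widthTicks;
        done = false;
        interrupts();
        Serial.print(w * 62.5);        // ticks to nanoseconds
        Serial.println(" ns");
    }
}

Because the capture register latches the count at the edge itself, interrupt latency only delays the reporting, not the measurement.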

Stop thinking in microseconds, and think in "clock cycles" or "nanoseconds" instead. You can easily find or write a piece of code that delays for 12500 nanoseconds (+/-62.5ns, +/- interrupts/overhead/etc.) or set up a timer to count exact clock cycles, and then convert to your preferred units at the user interface level.

See for instance _delay_us() in util/delay.h (which actually DOES take floating point arguments, but they MUST be compile-time constants so the loop count optimizes away before it executes!)
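A minimal example (the F_CPU guard is only there in case the build doesn't define it, as the Arduino toolchain normally does; PB0 = digital pin 8 on an ATmega328):

#ifndef F_CPU
#define F_CPU 16000000UL
#endif
#include <avr/io.h>
#include <util/delay.h>

void setup()
{
    DDRB |= _BV(PB0);        // PB0 as output
}

void loop()
{
    PORTB |=  _BV(PB0);
    _delay_us(12.5);         // argument must be a compile-time constant
    PORTB &= ~_BV(PB0);
    _delay_us(12.5);
}

If the argument isn't a constant (or optimization is off), avr-libc falls back to a much longer, unpredictable delay, which is exactly the "must optimize away" caveat above.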