Yes, I also thought of that, but I still end up with a gap of 0.666 microseconds.
The frequency of the crystal will shift due to temperature differences.
Please provide a working sketch, describe how and what you have measured, and what you have expected instead.
That doesn't tell us anything. 0.666 µs is very little with a period of some ms, but really a lot with a period of some µs. And how did you measure?
It's true, I know, and I'm adjusting at 25 °C, like the base resistance of an NTC probe.
62.5 ns is for a 16 MHz clock frequency; miguelyx is asking about an 8 MHz clock, so 125 ns.
I put a digitalWrite() at the start and at the end of what I need to measure.
Then I just watch pin 13 with the oscilloscope, and I can see how much time passes from the end of the last pulse of the first digitalWrite() to the start of the first pulse of the second digitalWrite().
But imagine how tedious it is to measure, and measure, and measure again that way.
It is normal practice for anyone who works with programming and needs to sync or adjust timings (if they have an oscilloscope), but it is really hard.
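For reference, a minimal marker sketch along those lines could look like the one below (illustrative only; pin 13 and the two-pulse pattern come from the description above, everything else is an assumption). Note that digitalWrite() itself takes a few microseconds, so it adds its own offset when chasing sub-microsecond differences.

const byte markerPin = 13;          // assumption: the pin watched on the oscilloscope

void setup() {
  pinMode(markerPin, OUTPUT);
}

void loop() {
  digitalWrite(markerPin, HIGH);    // first marker pulse
  digitalWrite(markerPin, LOW);
  // ... the code section whose duration is being measured ...
  digitalWrite(markerPin, HIGH);    // second marker pulse
  digitalWrite(markerPin, LOW);
  delay(1000);                      // keep successive measurements apart on the scope
}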
No, it doesn't answer the question. I guess there is a language translation problem here also. I'm sorry. Good luck.
I'm not really sure what your question is about. But if it is how the delayMicroseconds() function is implemented you may have a look
here https://github.com/arduino/ArduinoCore-avr/blob/master/cores/arduino/wiring.c
#elif F_CPU >= 8000000L
// for the 8 MHz internal clock
// for a 1 and 2 microsecond delay, simply return. the overhead
// of the function call takes 14 (16) cycles, which is 2us
if (us <= 2) return; // = 3 cycles, (4 when true)
// the following loop takes 1/2 of a microsecond (4 cycles)
// per iteration, so execute it twice for each microsecond of
// delay requested.
us <<= 1; //x2 us, = 2 cycles
// account for the time taken in the preceding commands.
// we just burned 17 (19) cycles above, remove 4, (4*4=16)
// us is at least 6 so we can subtract 4
us -= 4; // = 2 cycles
and further information (but not verified ...)
here https://electronics.stackexchange.com/questions/84776/arduino-delaymicroseconds
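To make the cycle accounting in that snippet concrete, here is a worked example for delayMicroseconds(10) at 8 MHz (my own reading of the code above, not measured on hardware):

// us = 10        -> greater than 2, so no early return
// us <<= 1       -> us = 20  (two loop iterations per requested microsecond)
// us -= 4        -> us = 16  (compensates the roughly 2 us of call/setup overhead)
// busy loop: 16 iterations x 4 cycles = 64 cycles = 8 us at 8 MHz
// 8 us of loop plus ~2 us of overhead comes to about the requested 10 us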
I'm working on a counter and I want the precision of an atomic clock, or failing that, the best precision possible.
Well, that counter has a gap (desync) of 0.666 µs, which is a lot of desync, and I cannot reduce those 0.666 µs because the smallest delay I can use is delayMicroseconds(1), which is too big for 0.666 µs. If I use it, the error becomes about 0.333 µs in the other direction, but that correction lands after the time check, and I would prefer it to happen before the time check, without using delayMicroseconds(1).
Because of this I'm thinking of using neutral commands or operations, like seg = seg; or X = X;, placed in the routine to create a small delay, but I suppose each one takes around 50 ns, so I would need a lot of these neutral operations in the code. That is why I'm looking for a way to reduce the desync without using neutral operations.
But I don't know how.
There is an assembler instruction called "No Operation" nop ...
asm volatile("nop\n\t");
which I used in another thread for an attiny85
https://forum.arduino.cc/t/attiny85-108khz-square-wave-50-duty-cycle/1160510/4?u=ec2021
to provide a signal of (about) 108 kHz.
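As a rough sketch of how that could be used for sub-microsecond trimming at 8 MHz (each nop is one cycle, i.e. 125 ns; the count of five here is only an illustration giving about 625 ns, not a verified fix for the 0.666 µs gap):

// Burn a few single-cycle nops to delay by a fraction of a microsecond.
// asm volatile keeps the compiler from removing or reordering them.
static inline void shortTrim() {
  asm volatile(
    "nop\n\t"   // 125 ns per nop at 8 MHz
    "nop\n\t"
    "nop\n\t"
    "nop\n\t"
    "nop\n\t"
  );
}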
I just realized that a counter based on the internal Timer1 is not affected by the delay functions or by anything else I do.
I just tried delayMicroseconds(1) and had the same desync, so I set delay(5000); and there is still a gap of 0.666 µs.
Because it is an internal hardware timer based on an 8 MHz crystal.
It is only affected by the frequency (Hz) set via OCR1A.
So the only way is to modify the Timer1 values.
ISR(TIMER1_COMPA_vect) {
  delay(5000);
  TCNT1 = 0; // ISR function
  N++;
}
For 8 MHz, consider CTC mode with OCR1A as the TOP value:
...which resets TCNT1 in hardware without software-induced delays.
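A minimal configuration sketch along those lines, assuming an ATmega328P-class AVR at 8 MHz, a /256 prescaler and a 1 Hz compare interrupt (the counter N mirrors the one used in the thread; this is not a drop-in replacement for the original code):

#include <avr/interrupt.h>

volatile unsigned long N = 0;        // seconds counter, as in the thread

void setup() {
  noInterrupts();
  TCCR1A = 0;
  TCCR1B = 0;
  TCNT1  = 0;
  OCR1A  = 31249;                    // 8,000,000 / 256 = 31,250 ticks/s -> TOP = 31,250 - 1
  TCCR1B |= (1 << WGM12);            // CTC mode, TOP = OCR1A
  TCCR1B |= (1 << CS12);             // prescaler 256
  TIMSK1 |= (1 << OCIE1A);           // enable compare match A interrupt
  interrupts();
}

ISR(TIMER1_COMPA_vect) {
  N++;                               // hardware already cleared TCNT1 at the compare match
}

void loop() {
}

The point is that TCNT1 is never written in software, so the period depends only on the hardware and the OCR1A value.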
And also consider a crystal for your clock
It's all useless here. As long as you don't explain exactly what you are doing and how you measure this ominous 0.666 µs gap (measurement setup and sketch), it's all just guesswork. If you keep it a secret, nobody can really help.
Also, the 0.666 µs is measured as I described before: using an oscilloscope on pin 13 with two digitalWrite() calls, one at the start and one at the end. In this case the first one is when execution returns to the routine after N++, and the other digitalWrite() is where N is set to 0 at the end of the routine, i.e. the end of a complete 1-second cycle. The difference should be about 1 second, but it is really 0.999999334 s.
1 s - 0.999999334 s = 6.66e-7 s = 0.666 µs.
That is the same as 1 second of desync every 17 days, 9 hours, 5 minutes and 1.5 seconds.
Measuring with the oscilloscope, the edge that should come every 1.0 s shows that small difference, 0.999999334 s instead of 1 s, and this difference of 0.666 µs is the desync added every second.
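For completeness, the arithmetic behind those figures, spelled out (illustrative only):

// drift per second:      1.0 s - 0.999999334 s = 6.66e-7 s = 0.666 us
// time to drift by 1 s:  1 / 6.66e-7 s ≈ 1,501,501.5 s
// in days:               1,501,501.5 s / 86,400 s per day ≈ 17.38 days
//                        = 17 days, 9 hours, 5 minutes and 1.5 seconds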
Please keep it friendly people!
Thank you.
@miguelyx
ah - no. I changed my mind.
best regards Stefan
https://onlinedocs.microchip.com/pr/GUID-0EC909F9-8FB7-46B2-BF4B-05290662B5C3-en-US-12.1.1/index.html?GUID-1EB44C0C-12C2-45B2-8B32-CEE5963C9F73