Hi, I am working on a project and I need to design a circuit which outputs digital logic-1 for 100 nanoseconds and then logic-0 for 10 microseconds, repeating in a loop. I wrote some code, but I am not sure it works correctly.
Here is the code:
unsigned long s1time, s2time;
void setup(){
  Serial.begin(9600);
  pinMode(8, OUTPUT);
}
void loop(){
  digitalWrite(8, HIGH);
  delayMicroseconds(0.1); // want 100 ns high -- but the argument is an integer and truncates to 0
  digitalWrite(8, LOW);
  delayMicroseconds(10);  // 10 us low
}
From the Arduino reference for delayMicroseconds(): "This function works very accurately in the range 3 microseconds and up. We cannot assure that delayMicroseconds will perform precisely for smaller delay-times."
An exact 100 ns delay is not going to be possible with a regular Arduino board. You could get close, but you'd probably have to use some assembly code rather than the Arduino language.
Even then, the closest you could get is 2 cycles, which at 16 MHz (62.5 ns per cycle) is 125 ns.
Not only that, but digitalWrite() itself takes far more than one cycle to complete (several microseconds on a stock core). In fact, your code is spending much more time in all the other functions than in the delayMicroseconds() calls. That's why you'd have to use AVR assembly, or at least direct port access, instead.
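To illustrate, here is a minimal sketch, assuming an ATmega328P-based board such as an Uno at 16 MHz, where pin 8 is bit 0 of PORTB. Writing the port register directly compiles down to single sbi/cbi instructions, giving the 2-cycle (125 ns) high pulse mentioned above:

void setup(){
  pinMode(8, OUTPUT);      // pin 8 = PB0 on the ATmega328P
}
void loop(){
  PORTB |= _BV(PORTB0);    // sbi: 2 cycles, pin goes high
  PORTB &= ~_BV(PORTB0);   // cbi: 2 cycles, so high time = 2 cycles = 125 ns
  delayMicroseconds(10);   // roughly the 10 us low time
}

The low time will come out slightly longer than 10 us because of the loop() call overhead, but the high pulse width is fixed by those two instructions.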
If you really need 100 nanoseconds exactly, I'd look for some kind of external chip to do it for you.
As for the 100 ns delay, forget it with a stock Arduino. Just maybe if you clock it at 20 MHz and get down and dirty with some assembler (at 20 MHz one cycle is 50 ns, so a two-cycle pulse comes out to exactly 100 ns), but otherwise no chance.
Actually, this just might be possible with the proper settings for PWM. You'd have to program the internal timer registers directly to get the right frequency and duty cycle, though.
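Something like the following might serve as a starting point; it's a sketch assuming an ATmega328P at 16 MHz, with the output moved to pin 9 (OC1A, the Timer1 compare output). Timer1 in Fast PWM mode 14 with no prescaler ticks every 62.5 ns, so the closest achievable high time is 2 ticks (125 ns), and the low time comes out to exactly 10 us:

void setup(){
  pinMode(9, OUTPUT);                            // OC1A
  TCCR1A = _BV(COM1A1) | _BV(WGM11);             // non-inverting output on OC1A
  TCCR1B = _BV(WGM13) | _BV(WGM12) | _BV(CS10);  // Fast PWM mode 14 (TOP = ICR1), prescaler 1
  ICR1  = 161;  // period = 162 ticks * 62.5 ns = 10.125 us
  OCR1A = 1;    // high for (OCR1A + 1) = 2 ticks = 125 ns, then low for 160 ticks = 10 us
}
void loop(){
  // nothing to do here: the timer hardware generates the waveform on its own
}

The nice part is that the waveform is produced entirely in hardware, with no jitter from interrupts or function-call overhead. And with a 20 MHz crystal (50 ns ticks), ICR1 = 201 and OCR1A = 1 would give exactly 100 ns high and 10 us low.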