1- Transmit a pulse of radiowave at Time A
2- Receive that pulse at Time B
3- Calculate time difference = Time B - Time A
4- Calculate Range based on that difference.
I don't think resolution will be an issue. I think the issue would be actually receiving a signal at a reasonable distance.
Light travels ~0.3 m/ns (about 3.3 ns per metre). I don't see how any of the devices you will require will have a low enough latency to make a useful measurement in a direct manner.
I can't be sure, but I expect GPS devices have ASICs that resolve the satellite signals without a processor being in the middle.
Oh, according to what theory? Can that processor, or a Nano processor as you have suggested, complete an entire RF ranging operation in a single clock cycle? Just because the CPU can twiddle a few bits in that time doesn't at all mean it can send commands to transmit and receive, and interpret RF echoes, in that time.
Of course there can't be a nanos() function, and it wouldn't help you anyway, for the reason stated above. The MCU internal timers that micros() and millis() use for timing don't run even close to that fast. Also, there is a few microseconds (hence a few thousand nanoseconds) of overhead to any function call, even disregarding any work it has to do.
Practice, not theory. Internal counters/timers typically run at up to the controller clock frequency. For timing external signals, something like the ATmega's input capture should be available, which latches the timer value when a signal arrives. Then the real-time aspect is finished and post-processing can take as long as it takes.
That only allows you to produce and capture simple digital pulses on specifically equipped I/O pins. A hypothetical "nanos()" function would time program events only, just as micros() and millis() do now.
You can ignore for a moment the possible timing-resolution issues of the microcontroller; the main issue is how you would know when to stop the 'reception timer'.
The OP has not revealed any details of what they are actually trying to do in practice (another hint!), so we are guessing.
How would you accurately time-stamp the arrival of some sort of signal from a remote transmitter?
On the first bit of noise, or on some form of data decode that takes the equivalent of many kilometres to carry out?
Hey srnet, thank you for suggesting the LoRa transceiver, I will look more into it.
As for the project you asked about, it is to map drone range from the ground. Most modern radars rely on FMCW to process range, but we want to try a pulse radar model as it is more cost friendly.
Right now we are experimenting with the HB100 Doppler module, but that is one module out of many. Since it is Doppler, only moving objects will be detected.
If you send 10 GHz you only receive back the same 10 GHz, which goes through a mixer and bandpass filter, and the final output is the intermediate frequency (IF), a small band, something like 1-400 Hz.
If there is anything other than 0, then that is the Doppler effect.
Now the idea we want to mod in is this: if we pulse the transmit signal ON for 10 ns and an object is detected, then it will echo back a 10 ns IF signal. Calculating the time between the beginning of the transmit and the receive gives the range.