latency measurement between optical flash and a "beep" signal

Hey Folks

I have a project going on to measure the latency between a flash popping up on my screen and the "beep" signal coming from a normal 3.5 mm jack plug. My first approach would be to measure the brightness of an area of the screen over time and at the "same" time measure the voltage at the jack plug, for example. The internal latency from the processing time I want to calculate from the code and the known processing times per command. The accuracy should be around 1-2 ms. But is the measurement of the beep signal even possible with simple methods? I know there's a frequency problem, but an amplitude peak would do it, I think. So does anybody have an idea how I can measure the "beep" (and it's a quite short beep) directly from the jack plug?

The final program should give me the latency so I can correct it manually.



You need to find out the frequency of the beep.

If you want to use an analog input, see Reply #4 in this thread:

This program can measure an analog input at 40,000 samples per second.

ANALOG05.xlsx (3.42 MB)

MyAnalogBinLogger_A0_10_bits_ADC1000_40000_BPS.ino (23.7 KB)

Very nice! Thanks, Raschemmel!

Is it an "instant" signal I get when the beep comes, or does the LM567 (or the whole circuit) have a noticeable latency itself?

I could build you an engine, but electronics… I need help for sure :D

See the last page of the attached datasheet for the frequency-selection formulas.

lm567.pdf (1.19 MB)

Lombaseggl: Is it an "instant" signal I get when the beep comes, or does the LM567 (or the whole circuit) have a noticeable latency itself?

I think you'll need to worry more about the "flash" than about the audio. Specifically, you will need to know exactly when the flash shows up at the point on the screen you are measuring. To do this, you'll need to know something about the screen itself: refresh rate, start of refresh, direction of the refresh pattern (left to right, right to left, top to bottom, bottom to top, etc.).

For example, on a CRT, running at standard TV frequencies, a half frame refresh takes 16.666... milliseconds, which is the vertical scan rate. Horizontal scan is 15,750 Hz, or about 63.5 microseconds. When you flash something onto the screen, it will not show up at your light detector until the sweep has brought the beam to it. This delay will vary, and unless you know when you told it to flash (relative to the current beam position), and where, exactly, in the sweep your detector is looking at, you can't get closer than 16.66 milliseconds of accuracy.

Did I hear someone say "But that's a CRT! Surely modern flat-panel monitors refresh faster than that"? Well, yes they do, but they present a whole new problem. Unless you know their refresh pattern, AND you can tell WHEN that refresh pattern starts, you still can't get an accurate time between the flash and the sound.

It might help if you describe more fully what you are trying to accomplish. Are you timing a program on a PC? On a tablet?

If you want a simple circuit to detect the tone, you can make a bridge rectifier (see attached) using 1N914 or 1N4148 diodes, substitute a 1 uF to 10 uF capacitor for the R(load) resistor in the schematic, and then use analogRead().
All the computer tones are going to be 1 V peak to peak, so with the above circuit you should get at least 1.5 V DC, which should give you an analog value of about 300. You can use millis() to record the time with an "if" statement testing for
val > 200 or val > 250. Save the value to a variable and print it to the serial monitor. You can also add a comparator level detector.

See the attached datasheet for a high-speed, single-supply 5 V comparator.
You can then use digitalRead() to detect it
and record the event with millis() using an "if" statement.


MAX941-MAX944.pdf (887 KB)

THX for the great help!

It's a Sony LCD TV. I tried to get information from Sony about the latency from signal input until the picture is built up, but never got an answer. Now the outing :) I'm a big fan of rhythm gaming, and it's nearly impossible to get a tight latency correction with the internal possibilities. The TV has a gaming mode where all picture processing is brought to a minimum. But for a musician like me it's hard to ignore the lag between the picture and the sound. So that's the reason for doing this project. In the game there are about 10 pings in picture and sound, so the variation in latency could be averaged over these pings. I had not considered the problem with the picture build-up time, so thanks for that. But I think knowing when the flash should appear is OK for that. The time between the beep and the recognition of the short flash is the problem, I think.

Thanks for the easy method to detect the beep. It's coming from a stereo. So I thought to first level the sound in from "zero" to something, watching it on a display so as not to overload the analog input. Then start from the beginning and measure the timing in a digital sound/no-sound, flash/no-flash way.
Maybe 1-2 ms accuracy is not possible because of the picture build-up and frequency problems, but in music a little lag is no problem and the brain fixes it. As a drummer you get used to it :D I just want it to be as accurate as possible.

The routine shall be:
Level in.
Start measuring the flash and trigger the guitar on point - I removed a circuit from a guitar and want to replace the mechanical switch with the Arduino, so picture and guitar are as in sync as possible.
Then, in the next phase, measure the beep and the flash and show the latency on the display so I can correct it manually.

Thanks again for the great help regarding the electronics!