Screen lag timer

I’m making an input lag detector with an Uno and an Xbox 360 controller and console. I have a light-dependent resistor (LDR) positioned appropriately on the TV/monitor, and I’m using the game Halo: Reach as a test bench.

When I press a button on the breadboard, the Arduino simulates pressing the right trigger on the controller, which causes the game to fire a shot. The program measures the time from the signal being sent until the LDR detects a change in brightness on the TV/monitor (caused by the muzzle flash of the gun in the game).

I’ve tested two TVs, but the latency is very consistently the same on both: they each read 134 milliseconds. I was expecting a little more variance. I double-checked the program by firing a shot and then using a torch to trigger the LDR, and the readings varied accordingly: I got values from a few milliseconds to over 10 seconds depending on when I shone the torch on it, so the code must be working properly.

My issue is that one TV has noticeable input lag that I can feel when I’m playing, while the other TV feels like it has no input lag whatsoever. Why does my data not seem to support this noticeable latency? My only thought, if the code is correct, is that the LDR has a high rise and fall time (slow response), so it isn’t accurate at detecting high-speed changes in light?

Any help would be appreciated

int ldr = A0;                 //light-dependent resistor
int lightRef = 0;             //reference light level to measure against
int tolerance = 10;           //tolerance for light level to prevent accidental triggering
int btn = 12;                 //right trigger button
unsigned long startTime = 0;  //to store initial millis() data (millis() returns unsigned long)
unsigned long endTime = 0;    //to store final millis() data
int btnStart = 8;             //button to start the entire function

void setup() {
  pinMode(btn, OUTPUT);       //right trigger as an output
  digitalWrite(btn, LOW);     //ensure the right trigger is released so the game doesn't accidentally shoot during setup
  pinMode(btnStart, INPUT);   //start button as input
  Serial.begin(9600);         //needed for the Serial prints below
}

void loop() {
  if (digitalRead(btnStart) == HIGH) {              //if the start button has been pressed, do the following:
    lightRef = 1023 - analogRead(ldr);              //take a reference light level before a shot has been fired to compare with. (1023 - [sensor reading]) inverts the relationship between light level and sensor value
    for (int i = 0; i < 15; i++) {                  //shoot 15 times only (to prevent timings being thrown off by reloading)
      startTime = millis();                         //start timing by storing the current time
      digitalWrite(btn, HIGH);                      //simulate pressing the right trigger on the controller to cause the game to fire a shot
      while (1023 - analogRead(ldr) - tolerance < lightRef);      //do nothing until the LDR detects a brightness greater than (reference + tolerance). This is when the gun fires and creates a muzzle flash.
      endTime = millis();                           //upon exiting the loop, store the time
      Serial.print("Shot ");                        //serial commands...
      Serial.print(i + 1, DEC);
      Serial.print(": ");
      Serial.println(endTime - startTime);          //endTime - startTime gives total time elapsed
      digitalWrite(btn, LOW);                       //release the right trigger button
      delay(1000);                                  //wait a second to ensure the gun is ready to fire again
    }
    delay(3000);                                    //wait 3 seconds after firing 15 shots to allow for a reload
  }
}

My only thought, if the code is correct, is that the LDR has a high rise and fall time (slow response), so it isn't accurate at detecting high-speed changes in light?

Yes, LDRs have a slow response time. The response also varies between dark-to-light and light-to-dark transitions, and with light wavelength and intensity. LDRs also have some memory effect. If you research, you may find some faster ones, but here is an example: