Need help with a "timer on delay"

Hello guys, I am a complete newbie trying to develop code for a school project, so I could really use the help of the experts. What I am basically doing is using a long-range IR sensor to read the position of any object within range. Now I want to incorporate a timer: if an object stays within the 10 cm to 100 cm range for longer than five seconds or so, my if/else routine springs into action, energizing the necessary pins to drive a relay that activates the rest of my circuit; but if the object moves out of range within those five seconds, the routine should not run. I'm trying to prevent "false starts". Thanks in advance, guys and gals.

Here's a simple setup. You'll have to declare the rest of the variables (like distance) yourself. As long as distance is outside your desired range, starttime is reset to the present time on every pass through the loop. Once distance enters the desired range, starttime is frozen in place while present keeps increasing, and once the difference exceeds 5000 milliseconds you can do something.

//global declarations (outside setup and loop, so both can see them)
unsigned long starttime = 0;
unsigned long present = 0;

//in the main loop:

present = millis();                     //grab the current time

if (distance < 10 || distance > 100){   //if it's outside the desired range,
  starttime = millis();                 //then the timer is constantly reset
}

if (present - starttime > 5000){        //in range for 5 s straight
  //do something
}
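
If it helps, here's one way the whole thing might fit together as a complete sketch. Everything hardware-specific is an assumption on my part: the relay driver on pin 7, the sensor on A0, and readDistanceCm() is a hypothetical placeholder for whatever raw-to-centimeters conversion your particular sensor needs (check its datasheet; Sharp long-range IR sensors are nonlinear). A nice side effect of the unsigned math is that present - starttime stays correct even when millis() rolls over after about 49 days.

const int RELAY_PIN  = 7;     //assumption: relay driver on digital pin 7
const int SENSOR_PIN = A0;    //assumption: analog IR sensor on A0

unsigned long starttime = 0;

//Hypothetical helper: replace with the conversion for your sensor.
//This crude placeholder just maps the raw reading onto 0..150 cm
//and is NOT calibrated for any real sensor.
int readDistanceCm() {
  int raw = analogRead(SENSOR_PIN);   //0..1023
  return map(raw, 0, 1023, 150, 0);   //placeholder conversion
}

void setup() {
  pinMode(RELAY_PIN, OUTPUT);
  digitalWrite(RELAY_PIN, LOW);       //relay off at power-up
}

void loop() {
  unsigned long present = millis();
  int distance = readDistanceCm();

  if (distance < 10 || distance > 100) {   //out of range:
    starttime = present;                   //keep resetting the timer
  }

  if (present - starttime > 5000UL) {      //in range for a full 5 s
    digitalWrite(RELAY_PIN, HIGH);         //energize the relay
  } else {
    digitalWrite(RELAY_PIN, LOW);          //drop out on a "false start"
  }
}

As written, the relay drops back out as soon as the object leaves the range; if you want it to latch on once triggered, set a boolean flag inside the 5-second branch and drive the relay from that flag instead.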