Normally we use delay() to blink an LED, but delay() is bad because it freezes the MCU, so we use millis() instead. But millis() seems complicated to use. For timing, why not just count passes through loop()?
We don't want the MCU's CPU tied up just counting off time.
We want *other* things to run while the LED blinks, and those other things change how many loop passes add up to a 1 Hz blink. When they include occasional responses to user actions, there is no way to tell what count is needed from one blink to the next, so we use millis() or micros(). A timer interrupt could work, but that is exactly how micros() and millis() work under the hood, and they let us time loads of events at once.
What's so hard about millis()?
The unsigned longs? Think of one as the hour hand on a round clock, except it counts up to about 4.29 billion (2^32 - 1) instead of 12 before it wraps back around.
As long as you use unsigned variables and subtract the start time from the end time, you get the correct elapsed time, up to the maximum count, which for millis() is about 49.71 DAYS. With only the second hand of a clock you can measure up to 59 seconds; with millis() you get nearly 50 days as your longest interval.
And it never matters if the end value is less than the start value: rollover is a non-issue with unsigned math. You always get the correct difference, up to one full trip around the counter minus 1.
time_difference = time_end - time_start;
On a round clock, suppose it is now 2 o'clock and you arrived at 10 o'clock, and you want to know how long you've been there. The hour hand is at 2; to subtract 10 (the start time, when you arrived), move the hand backwards 10 hours, and it now points to 4. You've been there 4 hours: clock 2 minus clock 10 is clock 4. Unsigned math works the same way; the round clock is just unsigned arithmetic in base 12.
BTW, I use an unsigned int with ( millis() & 0xFFFF ) to time events of about a minute or less: 0xFFFF ms is just over 65 seconds.