Surely a better value than the hardcoded 100 (ms) (the expected time for your code to execute in) would be the natural size of the call's parameter?
i.e. if I want to run something every 150 ms, that value fits in an unsigned 8-bit var, and I (the programmer) expect this bit of code to execute within that 150 ms, so that it can be called again 150 ms later.
So double the 150, and then see what minimum size is needed to hold it without overflowing; in this case it will be a 16-bit var
(that being 2 * 150 = 300, which is greater than 256, so we move to the next range, i.e. 65536).
This way, you won't get any "run after x" routines failing to run because of overflow in the macro's variable subtraction.
Likewise, if I want to run something every 2000 ms, then 4000 is within the 65536 range while watching the 32-bit unsigned long millis var counting up, so a 16-bit var is fine.
If I want something to run every 60000 ms (1 minute), then perhaps my routine might take, say, 59 seconds of that minute, and so we need to watch using a 32-bit sized var.
Or am I missing something here?