I want to use units of 1/1024th of a second instead of milliseconds for a project that does a lot of timing math in an ISR, so the arithmetic can be done with bit shifts and masks instead of division and modulo. For microsecond-scale measurements I could use 1/1048576th (2^20) of a second. Essentially, I'm talking about keeping time in base 2 instead of base 10.

The timing needs to be precise enough that I can't just call 1/1024 s a millisecond and leave the ~2.4% error in. But I can convert user input into these units as it's entered and let the entire program run in them internally.

It seems like the kind of thing that's been done so many times it probably has a name. So before I start reinventing the wheel, I thought I'd ask here first.