Still not seeing it. The time that millis() or micros() disables interrupts is MUCH smaller than the time interrupts are already disabled during the execution of any interrupt service routine.
My comments are essentially identical to what "dBC" said on your link:
Surely there'd have to be two interrupts from the same source during the disabled window before you'd lose anything? Otherwise the interrupt just gets delayed until they're globally re-enabled. Looking at millis() for example, it disables them for about 9 cycles, or 562.5 nsecs at 16MHz. So worst case, if an interrupt fired just as they were disabled, that event would get delayed by 562.5 nsecs. Is the PLL code really that sensitive that it can't cope with 1/2 usec of jitter in the timer interrupt?
Calling millis() or micros() "very frequently" in the main loop would be an example of the "badly coded" comment I threw out. Although, it's not clear that there is a great alternative. I think you'd be better off asking for a version of delay() (or whatever) that doesn't call millis()/micros().
You haven't found a bug; you've found a requirement of your code that the Arduino environment doesn't meet. Since people who should know seem to define a "hard real time" OS as one with less than 100us of jitter in interrupt response time (http://www.lisha.ufsc.br/wso/wso2009/papers/st04_03.pdf), even on CPUs that are much faster than the AVR, you're likely to have a hard time finding anything better.