odometer:
By the way, your interval looks like 11 seconds, not 9 seconds.
At some point in my childhood I think I knew I was starting to rely too much on technology and my brain would turn progressively into guacamole...
Mine already is guacamole.
mrthekod:
Would Timer 1 not also be affected if they stop and start interrupts?
The timers run whether or not interrupts are enabled. What you miss is overflow events: millis() needs to detect one every 1024 µs, and if you miss one you are out by that amount. Timer 1 is a 16-bit timer, so it can count much higher (65536 counts per overflow), and I used a bigger prescaler so it counts more slowly. Thus you only get an overflow roughly every 4 seconds. Plus you can detect that an overflow occurred and allow for it, so you can time up to 8 seconds with interrupts off.
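Something like this, assuming a 16 MHz ATmega328P (e.g. a Uno); setupTimer1() and timer1Elapsed() are illustrative names of my own, not library functions:

```cpp
// Timer 1 in normal mode with a /1024 prescaler: each tick is 64 µs,
// so the 16-bit counter only overflows every 65536 * 64 µs ≈ 4.19 s.

void setupTimer1 ()
  {
  TCCR1A = 0;                        // normal mode
  TCCR1B = bit (CS12) | bit (CS10);  // prescaler of 1024
  TCNT1 = 0;                         // start counting from zero
  TIFR1 = bit (TOV1);                // writing a 1 clears the overflow flag
  }

// Elapsed microseconds since setupTimer1(), good to about 8.4 seconds
// even with interrupts off, because we poll the overflow flag (TOV1)
// instead of relying on an overflow interrupt being serviced.
unsigned long timer1Elapsed ()
  {
  unsigned int count = TCNT1;
  bool overflowed = TIFR1 & bit (TOV1);
  if (overflowed)
    count = TCNT1;   // overflow may have just happened; re-read to match
  return ((overflowed ? 65536UL : 0) + count) * 64;  // 64 µs per tick
  }
```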
You can effectively "freeze" the data stream during any high level by just holding the data line high. There does not seem to be any time limit for this. No data will be lost.
If you can modify the library, you could enable interrupts during one of these "frozen" high times. Maybe every 24th bit, or the 24th bit of every N color words. If there is a pending interrupt, it will be serviced. Then disable interrupts and keep going. You just have to calculate (or simply find by experiment) how often you have to do this so as not to miss an interrupt.
Sorry, I need to add a quick edit:
This "high" time has to be during a "1" bit, otherwise a "0" will be read as a "1". But if you use the LSB, you'll never see the error unless that color LED is off. And even then, in a matrix with other LEDs on, you probably would not notice it.
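A conceptual sketch of that idea. Real WS2812 drivers clock bits out in cycle-counted assembly, so this C version is illustrative only; sendBitFast() is a hypothetical stand-in for those timed writes, and the pin number is an assumption.

```cpp
#include <Arduino.h>

const byte DATA_PIN = 6;       // assumption: strip data line on pin 6

void sendBitFast (bool one);   // hypothetical: one bit with WS2812 timing

// Clock out one 24-bit GRB color word, pausing on the final (least
// significant) bit to let pending interrupts run. The pause holds the
// line high, which the LEDs read as a "1" - hence using the LSB, where
// a forced 1 is least visible.
void sendColorWord (unsigned long grb)
  {
  for (byte i = 23; i >= 1; i--)
    sendBitFast (grb & (1UL << i));  // bits 23 down to 1, normal speed
  digitalWrite (DATA_PIN, HIGH);     // start the final bit high ...
  interrupts ();                     // ... pending interrupts run here
  noInterrupts ();                   // back to strict timing
  digitalWrite (DATA_PIN, LOW);      // finish the bit's low phase
  delayMicroseconds (1);             // rough low time; tune to datasheet
  }
```

How long you can afford to wait between pauses (every word, or only every Nth word) is the part to find by experiment, as above.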