I built a project that uses an ISL29125 RGB sensor connected to an Uno with a SparkFun MIDI shield to convert color and light to musical notes. It works fine in most respects, except when the light source is not continuous. With LEDs dimmed by PWM, fluorescent lights, or analog NTSC video screens, the readings vary with the pulsing of the source. That jitter also drifts over time as the sensor's sampling moves in and out of phase with the pulse rate of the light source, and Arduino's timing is not consistent enough to simply hardcode the video refresh rate (60 Hz).
Can anyone here recommend a method of programmatically filtering out this scanline jitter, passing through only the brightness and color variations of the actual image on the screen?
Hardware setup:
Raw sensor readings, averaging red, green, and blue. Coarse jitter is visible.
Smoothed average reading: combines the last 10 values and yields the average of that stack.
Attached is my basic INO to test smoothing options.
My end goal is to measure changes in the brightness of a video image on the screen, producing notes when the brightness changes significantly. The video itself will produce a wide variety of values over time; distinguishing those value changes from the inherent signal jitter is my challenge.
Suggestions?
rgb_normalize.ino (1.01 KB)