I've got an idea for a project in which I need to determine the average colour of a VGA signal being fed to a projector. Due to the setup, I won't be able to run a USB cable back to the computer, so the usual approach of taking a screenshot to determine the average colour is out.
What I would like to do is take the RGB pins from the VGA output, run each one through a low-pass filter, and read the resulting DC voltage as the average level of that channel. Those three readings should together give me the average colour of the output.
Is this totally stupid, or does the idea have legs? I'm just not sure what type of circuitry would be able to take the VGA signals and produce a DC voltage that could be read with any sort of meaningful result. Luckily, the signal being processed will be mostly one colour or another, with slow, gradual changes, so it could work.
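On the readout side, here's a rough sketch of what I had in mind, assuming the filtered lines feed the analog pins of something like an Arduino (the pin assignments and scaling below are just placeholders, not a tested design). Since VGA colour signals run 0–0.7 V into 75 Ω, I'd switch to the 1.1 V internal ADC reference on a 5 V AVR board to get usable resolution:

```cpp
// Rough sketch: read three RC-low-pass-filtered VGA colour lines and
// report an approximate average RGB value. Pin wiring is hypothetical.

const int PIN_R = A0;  // filtered red line (assumed wiring)
const int PIN_G = A1;  // filtered green line
const int PIN_B = A2;  // filtered blue line

void setup() {
  Serial.begin(9600);
  // VGA colour signals top out around 0.7 V, so the 1.1 V internal
  // reference (5 V AVR boards) gives better resolution than the 5 V default.
  analogReference(INTERNAL);
}

// Map the filtered 0–0.7 V range onto 0–255.
// 0.7 V / 1.1 V * 1023 counts ≈ 651 counts at full scale.
int toByte(int raw) {
  return constrain(map(raw, 0, 651, 0, 255), 0, 255);
}

void loop() {
  // Note: the lines sit at 0 V during blanking intervals, so the
  // filtered averages will read slightly low across the board.
  int r = toByte(analogRead(PIN_R));
  int g = toByte(analogRead(PIN_G));
  int b = toByte(analogRead(PIN_B));

  Serial.print("avg RGB: ");
  Serial.print(r); Serial.print(", ");
  Serial.print(g); Serial.print(", ");
  Serial.println(b);

  delay(100);  // the content changes slowly, so 10 Hz sampling seems plenty
}
```

Since the content only changes gradually, I figure even a slow RC filter and coarse sampling like this would track it; the question is really whether the analog front end is sound.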