# VGA Colour Detector

Hi,

I've got an idea for a project in which I need to determine the average colour of a VGA signal being fed to a projector. Due to the setup I won't be able to run a USB cable back to the computer, so the usual approach of taking a screenshot to determine the average colour is out.

What I would like to do is take the RGB pins from the VGA output and run them through a filter and then use the signal to determine an average voltage of each pin. This should give me a reading which could be used to determine the average colour of the output.

Is this totally stupid, or does the idea have legs? I'm just not sure what type of circuitry could take the VGA signals and produce a DC voltage that could be read with any sort of meaningful results. Luckily, the signal being processed will be mostly one colour or another, with slow, gradual changes, so it could work.

Any ideas?

> Is this totally stupid, or does the idea have legs?

Doesn't sound nuts, but I think you'll need quite a high sampling frequency, otherwise you will miss plenty of pixels.

What is the goal of the project? Sort of an Ambilight?

> Any ideas?

You can use an LED as a colour sensor too (search this forum) and just use three LEDs to watch the screen...?

Yes, it's sort of like an Ambilight project, but I cannot connect digitally to the computer. The laptop will be about 10 metres from the projectors (there are two), which is too far for a USB cable to run.

A colour sensor won't work either because it will need to sit at the projector, not near the projected surface.

I think if I run the VGA output for each of the RGB lines through a rectifier and then through filters, I should get an average voltage for each line. When the image is mostly blue, for example, I should get a higher reading on the blue line.

These values are then read to produce the required colours.

> When the image is mostly blue, for example, I should get a higher reading on the blue line.

So if your sample time is 2t milliseconds, and for the first t red = 255 and blue = 0, and for the second t red = 0 and blue = 255, you will get an average of red = 127 and blue = 127, reading as light purple as the average colour. I think it will be difficult to correctly determine which colour is most abundant because of the averaging, but it could work (and I don't see a better option yet).

So I've been playing with VGA outputs and get a definite signal difference between screens showing primary colours. Now I need some code that can take about 500 readings from three separate inputs and average them out.

The signal I'm getting is AC, peaking at about 1 V or so. I'm hoping the Arduino can read these levels with enough accuracy to take the samples.