LED matrix - individual LED brightness

Hi there,

I was talking to a friend who is trying to make his own LED matrix (they're making their own LEDs from scratch at university, so they can't just buy a module). He asked me whether he could tie all of the cathodes of the LEDs together and use a single resistor instead of one resistor per LED. I said he could, but that the brightness of the LEDs would then vary, because the current through each LED would depend on how many LEDs were lit at the same time.
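To put rough numbers on that brightness change, here is a quick sketch. The component values are assumptions for illustration (5 V supply, a red LED with roughly a 2 V forward drop, a 220 Ω resistor), and it treats the LED forward voltage as constant, which is only an approximation:

```python
# Assumed example values - adjust for your actual supply, LEDs and resistor.
VCC = 5.0   # supply voltage, volts
VF = 2.0    # approximate LED forward drop, volts
R = 220.0   # series resistance, ohms

def per_led_current_shared(n_lit):
    """Single resistor shared by all cathodes: the total current through
    the resistor is roughly fixed, so each lit LED gets an equal share."""
    total_ma = (VCC - VF) / R * 1000.0  # about 13.6 mA total
    return total_ma / n_lit

def per_led_current_individual(n_lit):
    """One resistor per LED: every LED sees the full drop regardless of
    how many others are lit."""
    return (VCC - VF) / R * 1000.0

for n in (1, 2, 4, 8):
    print(n, "lit:",
          round(per_led_current_shared(n), 2), "mA shared vs",
          round(per_led_current_individual(n), 2), "mA individual")
```

With the shared resistor, one lit LED gets about 13.6 mA but eight lit LEDs get only about 1.7 mA each, which is exactly the brightness variation described above.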

After doing some online reading though I have found that this is actually what they do in a number of commercial LED matrices:


Image from circuitstoday

My question is: does this actually happen in real matrices? Is the change in current so negligible that the change in brightness isn't observable?

I have also found this image from circuitstoday, and it does appear that the brightness of a row/column depends on how many LEDs are powered:

That is a terrible design. :roll_eyes:

I am going to ask: which "commercial LED matrices" are these? Please cite.

As we frequently point out here, the 74HC595 is not designed to drive LEDs, let alone LED matrices. Its output drivers are not even as robust as the ATmega's, and the brightness variation depending on how many LEDs are illuminated - row LEDs, in the case illustrated - will be particularly noticeable.
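As a rough illustration of why the '595 struggles: a quick current-budget check, using the roughly 35 mA per-pin and 70 mA supply-pin absolute-maximum figures commonly quoted for 74HC595 parts (check the datasheet for your specific part), together with the ~13.6 mA per-LED figure from a 5 V / 220 Ω example:

```python
# Rough current-budget check for one fully lit row of 8 LEDs driven
# directly from a 74HC595. Datasheet limits are assumed typical values.
PER_PIN_MAX_MA = 35.0   # approx. absolute-maximum current per output pin
SUPPLY_MAX_MA = 70.0    # approx. absolute-maximum through the Vcc/GND pins
led_ma = 13.6           # nominal per-LED current (5 V, ~2 V drop, 220 ohm)
n_lit = 8

total = led_ma * n_lit
print(f"row total: {total} mA, supply budget: {SUPPLY_MAX_MA} mA,"
      f" within limits: {total <= SUPPLY_MAX_MA}")
```

At about 109 mA for a full row, the chip's supply-pin budget is exceeded well before the per-pin limit is even considered, so the outputs sag and the row dims as more LEDs turn on.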

But driving matrices with such "bodgie" arrangements is entirely pointless when proper matrix drivers - in particular the MAX7219 - are readily available and specifically designed to drive a matrix reliably and at a consistent brightness.


I see - now I understand the problem with the above example. Thank you for suggesting a proper driver!