Regarding "the LED brightness never changed": our eyes do not perceive brightness linearly.
See http://www.telescope-optics.net/eye_intensity_response.htm. From that:
In its basic form (Weber law), this implies that eye response to object luminance, as brightness discrimination, is not proportional to its actual (physical) intensity level; rather, that it changes with the intensity level, remaining nearly constant relative to it. This, in turn, under assumption that the relative value of just noticeable difference in brightness sensation is a unit of the sensation change, means that the perceived object brightness changes with the logarithm of object's actual brightness.
So, for one thing, your perception changes as the log of the actual brightness; and for another, depending on how you conducted the experiment, the LEDs may have "seemed" the same if there was a pause between the different tests.
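As a rough illustration of that logarithmic response, here is a minimal Python sketch that treats perceived brightness as simply the log of intensity in arbitrary units (a simplification of the Weber/Fechner idea, not a precise model of the eye):

```
import math

# Fechner's law (simplified): perceived brightness ~ log(I / I0).
# Using arbitrary units with I0 = 1, compare how big a perceived step
# a given physical change in luminance actually is.
for i_low, i_high in [(1, 2), (10, 20), (3, 30)]:
    step = math.log10(i_high) - math.log10(i_low)
    print(f"{i_low:>3} -> {i_high:>3} luminance units: "
          f"perceived step = {step:.2f} log units")

# Doubling the luminance is ~0.30 log units no matter where you start,
# and even a 10x change is only 1.0 log unit, which is why a 10x change
# in LED current does not look anywhere near 10x brighter.
```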
Just as an experiment, I hooked up an LED to a 5 V supply via a resistor substitution gadget. I could barely tell the difference in brightness between 100 ohms and 1 k, even though the current must have varied by a factor of ten. Smaller changes were certainly imperceptible, even going from 820 ohms to 220 ohms.
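For a rough sense of the currents involved, here is a quick Python calculation. It assumes a 5 V supply and a typical red-LED forward drop of about 2 V; the actual LED and its drop were not stated, so treat the numbers as illustrative:

```
# Rough LED current estimate for the resistor-substitution experiment.
# V_FORWARD is an assumed, roughly constant forward drop for a red LED.
V_SUPPLY = 5.0   # volts
V_FORWARD = 2.0  # volts (approximate)

for r_ohms in (100, 220, 820, 1000):
    i_ma = (V_SUPPLY - V_FORWARD) / r_ohms * 1000
    print(f"{r_ohms:>4} ohms -> about {i_ma:.1f} mA")

# 100 ohms gives ~30 mA and 1 k gives ~3 mA: a 10x current change that
# the eye reads as only a modest brightness step.
```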
So the "it seems to work" doesn't really hold up here. Now if the LEDs are all the same, you might get away with sharing a 1k resistor, but honestly, it is best to do it properly.
A while ago I made a clock (here) where I lazily shared a current-limiting resistor for all of the segments of a 7-segment display. That should work, huh? The trouble was that the more segments were on, the dimmer they got. So, for example, an 8 (all segments lit) was noticeably dimmer than a 1 (two segments).
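A quick back-of-the-envelope Python sketch of why that happens; the supply, forward drop, and resistor value are assumed for illustration, not taken from that clock:

```
# Why a shared current-limiting resistor makes "8" dimmer than "1":
# with a roughly constant segment forward voltage, the shared resistor
# sets a roughly fixed total current, which the lit segments then split.
V_SUPPLY, V_FORWARD, R_SHARED = 5.0, 2.0, 330.0  # assumed values

total_ma = (V_SUPPLY - V_FORWARD) / R_SHARED * 1000
for digit, segments in (("1", 2), ("7", 3), ("8", 7)):
    print(f"digit {digit}: {segments} segments lit, "
          f"~{total_ma / segments:.1f} mA each")

# With a resistor per segment, every segment would get the full current
# no matter how many are lit, so all digits would look equally bright.
```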
So even in a hobby gadget, cutting corners like this produces results that are annoyingly noticeable once the project is finished.