Let's say I have 20mA to work with, and I want to light two LEDs. Furthermore, let's say I have 5V to do it, and my LEDs have a voltage drop of 2.2V, so wiring them in series is an option.
My question for you is this:
If I wired the LEDs in series, using a single 33ohm resistor to provide the array with 20mA, would they be roughly twice as bright as they would be if I wired them in parallel, using two 330ohm resistors to provide each with 10mA?
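For reference, here's a quick check of the currents those resistor values actually imply, assuming the nominal 2.2V forward drop from the question (real Vf varies, as discussed below):

```python
# Sanity-check the currents implied by the values in the question.
# Assumes a fixed 2.2 V forward drop per LED; real parts will differ.

V_SUPPLY = 5.0   # volts
VF_LED = 2.2     # nominal forward drop per LED, volts

# Series: two LEDs in a string with a single 33 ohm resistor
i_series = (V_SUPPLY - 2 * VF_LED) / 33
print(f"series string:   {i_series * 1000:.1f} mA through both LEDs")

# Parallel: each LED gets its own 330 ohm resistor
i_parallel = (V_SUPPLY - VF_LED) / 330
print(f"parallel branch: {i_parallel * 1000:.1f} mA per LED")
```

Note that 33 ohms lands a touch under 20mA (the value that hits 20mA exactly at Vf = 2.2V would be (5 − 4.4)/0.020 = 30 ohms), and 330 ohms gives closer to 8.5mA than 10mA per branch.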
As a rule of thumb I use 50R as the limit for a series resistor.
The problem is one of variation: the forward voltage drop of an LED has a spread of values, and it changes with temperature, age, and from one individual LED to the next. Also, if you are driving this from a processor pin, the output voltage is not likely to be exactly 5V; it could be anything down to 3.8V.
So if you factor all that in, you will see you don't have much margin if the exact current is important.
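To put rough numbers on that margin, here's a sketch of the current swing for the series string. The Vf range (1.9-2.4V) and the supply range (3.8-5.0V) are illustrative assumptions for the example, not datasheet figures:

```python
# Worst-case current spread for two LEDs in series with a 33 ohm resistor.
# The Vf and supply ranges below are illustrative assumptions, not datasheet values.

def series_led_current(v_supply, vf_each, r_ohms, n_leds=2):
    """Current through an n-LED series string; 0 if the string can't conduct."""
    return max(0.0, (v_supply - n_leds * vf_each) / r_ohms)

best = series_led_current(5.0, 1.9, 33)   # full 5 V supply, low-Vf parts
worst = series_led_current(3.8, 2.4, 33)  # sagging pin voltage, high-Vf parts
print(f"best case:  {best * 1000:.1f} mA")
print(f"worst case: {worst * 1000:.1f} mA")  # string doesn't conduct at all
```

With only 0.6V nominally across the resistor, the spread swamps the design value: the same circuit can run well over 20mA or not light at all.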
It depends a lot on what you are trying to do. Yes, current through a resistor is wasted, and if you can put it through two LEDs you get more light out of it. But it is more difficult to control. I would look towards a constant-current circuit if I had so little headroom. But then, does the exact current matter?
Okay. Well, based on that it sounds like I ought to be safe even if I go with an 82ohm resistor to give 20mA of current with these LEDs.
Anyway, as for headroom, I probably have enough headroom to do 20mA or even 40mA; I'm just keeping my options open. I haven't calculated exactly how much current I'll be using in my array because I haven't decided what resistors I'll be using for that, so that'll affect what I have left to drive these LEDs. I know I'll be under what's safe for the Arduino, I just don't know by exactly how much. I've got a few places I can cut current if needed.
would they be roughly twice as bright as they would be if I wired them in parallel
Yes and No. How's that for an answer?
It depends on how you are going to use the LED. If you double the current through an LED the total amount of light it produces will probably be close to double the original. Much of this increase will be in the part of the spectrum that your eyes can't detect, so the change in brightness as perceived by your eyes may not be nearly that great.
My answer is based on some highly unscientific experimenting I did years ago using a variety of red LEDs and some sort of light meter.
AWOL, I think you misinterpreted what MarkT was trying to show. Mike had previously said that 33 ohms was too small, and Mark showed that 30 ohms, which is even smaller, is OK.
Of course it's not really OK as Mike explained in reply #4. The mathematics used in the 30 ohm calculation relies on a specific forward voltage which may or may not actually be the case in a practical situation.