a perplexing LED situation

My values: source V = 9 V, LED forward voltage = 4 V (white), LED current = 30 mA, 2 LEDs

I understand Ohm's law, and using that I would need a 33 ohm resistor: (9 V - 2 x 4 V) / 0.030 A = 33.3 ohms.
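(If it helps, here's that calculation spelled out in Python; the values are the nominal ones above, nothing measured:)

# Series resistor for LEDs in series: R = (Vsource - n * Vf) / I
v_source = 9.0    # supply voltage, volts (nominal)
v_f = 4.0         # datasheet forward voltage per LED, volts
n_leds = 2        # both LEDs in series
i_led = 0.030     # target current, amps

r = (v_source - n_leds * v_f) / i_led
print(r)          # 33.3... ohms, so a 33 ohm resistor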
When I input the specs into the LED array wizard, I get this:
Solution 0: 2 x 1 array uses 2 LEDs exactly

R = 39 ohms
The wizard says: In solution 0:
each 39 ohm resistor dissipates 35.1 mW
the wizard thinks 1/4W resistors are fine for your application
together, all resistors dissipate 35.1 mW
together, the diodes dissipate 240 mW
total power dissipated by the array is 275.1 mW
the array draws current of 30 mA from the source.
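(The wizard's power figures are just arithmetic at the nominal 30 mA design point; a quick sketch to check them:)

# Check the wizard's power numbers at the nominal 30 mA
r = 39.0                       # ohms, next standard value above 33.3
i = 0.030                      # amps, nominal design current
v_f, n = 4.0, 2                # per-LED forward voltage, LED count

p_resistor = i**2 * r          # 0.0351 W = 35.1 mW, fine for a 1/4 W part
p_leds = n * v_f * i           # 0.240 W = 240 mW across both diodes
print(p_resistor, p_leds, p_resistor + p_leds)  # total 0.2751 W = 275.1 mW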

SO... when I run the circuit, my meter reads 0.054 A in the circuit. I am using a 39 ohm resistor. Why is the current so much higher? I checked the source and it is a stable 9 V. It's weird... any thoughts?

THANKS

Aaargh. So after double-checking, the source voltage is actually 9.42 V. I added all my extra 1 ohm resistors and brought the resistance up to 41 ohms, and I'm still getting 54 mA...

You need to measure the voltage drop across the LED. Then Ohm's law will make sense.
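In fact, you can already back out the actual drop from the numbers you posted. A rough sketch (assumes your 9.42 V / 41 ohm / 54 mA readings):

# Back-solve the per-LED forward voltage from the measured values
v_source = 9.42    # measured supply, volts
r_total = 41.0     # total series resistance, ohms
i_meas = 0.054     # measured current, amps

v_resistor = i_meas * r_total           # ~2.21 V across the resistor
v_f_each = (v_source - v_resistor) / 2  # ~3.60 V per LED, below the 4 V spec
print(v_f_each)

Your LEDs are dropping roughly 3.6 V each, not the 4 V you designed for, so the leftover voltage across the resistor is bigger and the current goes up.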

You have the LEDs in series, yes? Not in parallel?

Could be the LEDs are dropping less voltage - measure across them and adjust the resistance accordingly.
Also, 30 mA sounds like a MAX rating; calculate back from 20 mA instead.

(9 V - (2 * 3.50 V)) / 0.02 A = 100 ohms, for example
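In Python, with that hypothetical 3.50 V measured drop per LED (plug in whatever you actually measure):

# Recalculate the resistor for a safer 20 mA using a measured LED drop
v_source = 9.0       # volts (with the measured 9.42 V supply, use that instead)
v_f_measured = 3.50  # hypothetical measured drop per LED, volts
i_target = 0.020     # amps, backed off from the 30 mA max

r = (v_source - 2 * v_f_measured) / i_target
print(r)             # 100.0 ohms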

thanks again...