So I'm making a parallel LED circuit using three Cree XLamp LEDs, which typically run at ~3.1 V @ 350 mA but can max out at 1000 mA.

So my logic may be flawed here but this is how I'm thinking (sorry if I seem stupid):

My power source is 12 V and 1 A, which means if I use three equal resistors, each LED will draw 0.333 A (333 mA), close to the typical value.

Resistor calculation: R = V/I ------ R = (12 - 3.1) / (0.333 × 3) ≈ 8.9 ohms. (I believe I'm supposed to multiply the denominator by the number of LEDs I have, correct?) So I would use a 10 ohm resistor.
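The arithmetic above can be sketched in Python (a rough check, assuming one shared series resistor carrying the total current of all three LEDs, with the 12 V supply and 3.1 V forward drop from the question):

```python
# Shared series resistor for 3 parallel LEDs (values from the question).
v_supply = 12.0    # supply voltage (V)
v_forward = 3.1    # LED forward voltage drop (V)
i_per_led = 0.333  # target current per LED (A)
n_leds = 3

i_total = i_per_led * n_leds          # total current through the resistor (~1 A)
r = (v_supply - v_forward) / i_total  # Ohm's law: R = V / I
print(round(r, 2))                    # ~8.91 ohms
```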

Resistor wattage: P = VI ------- P = 8.9 V × 1 A = 8.9 watts. So I would use a 10 ohm, 10 watt resistor.
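The dissipation works out the same way (again a sketch, using the 8.9 V resistor drop and the ~1 A total current from the question):

```python
v_drop = 12.0 - 3.1   # voltage across the resistor (V)
i_total = 0.333 * 3   # total current through it (A)
p = v_drop * i_total  # power dissipated: P = V * I
print(round(p, 2))    # ~8.89 W
```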

Is this correct?

Also, say I had only one LED. Would it draw the maximum current the power source can supply, i.e. 1 A?