12V & LED's

You would use Ohm's law to figure out the resistor you need... like the other answer stated, it would be roughly a 100 ohm resistor, but here is why...

You say you need 3.5V for the LEDs, and your supply voltage will be 14.5V... so by subtracting 3.5V from 14.5V, we find the resistor needs to drop 11V. Next, find out what current the LEDs draw... let's just say each one is 40mA. You have 3 LEDs in parallel, so you have 120mA total (40mA x 3). So now we know the voltage drop we need and the current we will have.

To find the resistance value we need, we take the voltage drop (in volts) and divide it by the current (in amps). So we need to convert the mA to amps by moving the decimal place: 120mA = 0.120A. Now we get Resistance = 11V / 0.120A, which means Resistance = 91.66 ohms.
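If it helps to see it as code, here is that same arithmetic as a small Python function. This is just a sketch; the function and parameter names (led_resistor_ohms, supply_v, and so on) are mine, not anything from the question, and it assumes identical LEDs sharing a single resistor so their currents add.

```python
def led_resistor_ohms(supply_v, led_forward_v, led_current_a, num_leds=1):
    """Resistor value (ohms) for identical LEDs fed through one resistor.

    Assumes the LEDs sit in parallel after the resistor, so each LED adds
    its own current but the forward voltage is only dropped once.
    """
    voltage_drop = supply_v - led_forward_v      # volts the resistor must drop
    total_current = led_current_a * num_leds     # e.g. 0.040 A per LED
    return voltage_drop / total_current          # Ohm's law: R = V / I
```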

I hope that makes sense. So if you had 14.5 volts and a single 3.5V 40mA LED, it would look like this: voltage drop = 14.5 - 3.5 = 11V, then we do the rest of the math: R = 11 / 0.04, R = 275 ohms.

If you had 14.5 volts and two 3.5V 40mA LEDs, it would look like this:
R = 11 / 0.08, R = 137.5 ohms.

If you had 12 volts and three 3.5V 40mA LEDs, it would look like this:
Voltage drop = 12 - 3.5 = 8.5V, current = 3 x 40mA = 120mA = 0.120A, do the math: R = 8.5 / 0.120, R = 70.83 ohms.
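And just to tie it back to the numbers above, calling that sketch function with the same values reproduces each example (again, led_resistor_ohms is my placeholder from the snippet earlier, not a real library call):

```python
print(led_resistor_ohms(14.5, 3.5, 0.040, 1))   # 275.0 ohms
print(led_resistor_ohms(14.5, 3.5, 0.040, 2))   # 137.5 ohms
print(led_resistor_ohms(14.5, 3.5, 0.040, 3))   # ~91.67 ohms
print(led_resistor_ohms(12.0, 3.5, 0.040, 3))   # ~70.83 ohms
```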

Anyway... you get the idea... I hope this helped some.