...but yes my LEDs are in series with a 270 ohm R. In series, the voltage gets divided across all of the series components.
LEDs basically "turn on" at about 2V (depending on the LED), and the voltage across them holds at roughly that value as you increase the current. Essentially, the effective resistance drops as the current increases.
If you put 3 LEDs in series, it takes about 6V to turn them all on, plus you get a voltage drop across the resistor. This is OK as long as you have enough supply voltage to "spread around",
and as long as you calculate/measure the voltage drop across the resistor to determine the resistor value (using Ohm's Law). With more voltage dropped across the series LEDs, there is less voltage remaining for the resistor, so you need a lower resistor value to get the same current with more LEDs.
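The Ohm's Law calculation above can be sketched in a few lines of Python. The supply voltage, LED forward voltage, and target current below are just example values, not numbers from this thread:

```python
# Sketch of sizing a current-limiting resistor for a series LED string.
# All component values are example assumptions.

def series_led_resistor(supply_v, led_vf, n_leds, target_current_a):
    """Ohm's Law on the leftover voltage: R = (Vsupply - n * Vf) / I."""
    v_remaining = supply_v - n_leds * led_vf
    if v_remaining <= 0:
        raise ValueError("not enough supply voltage to turn on all the LEDs")
    return v_remaining / target_current_a

# Example: 9V supply, three 2V LEDs, 20 mA target current.
# (9 - 6) / 0.02 = 150 ohms
r = series_led_resistor(9.0, 2.0, 3, 0.020)
print(f"{r:.0f} ohms")
```

Note how adding a fourth 2V LED on the same 9V supply leaves only 1V for the resistor, which is why more LEDs means a lower resistor value (and eventually not enough voltage at all).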
If you put 3 LEDs in parallel with the same current-limiting resistor, they might not all glow at the same brightness: although they all have the same voltage across them, they may have slightly different I-V characteristics, and therefore different currents flowing through each.
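To see how "slightly different characteristics" turns into different currents at the same voltage, here is a rough sketch using the Shockley diode equation. The diode parameters are purely illustrative assumptions, not datasheet values for any real LED:

```python
import math

# Sketch: two nominally identical parallel LEDs with slightly different
# I-V curves carry different currents at the same voltage.
# The saturation currents and ideality factor are made-up example values.

def diode_current(v, i_s, n=2.0, v_t=0.025):
    """Shockley diode equation: I = Is * (exp(V / (n * Vt)) - 1)."""
    return i_s * (math.exp(v / (n * v_t)) - 1)

v = 2.0  # both parallel LEDs see the same voltage
i1 = diode_current(v, i_s=1e-19)
i2 = diode_current(v, i_s=1.5e-19)  # 50% higher saturation current
print(f"LED1: {i1 * 1000:.1f} mA, LED2: {i2 * 1000:.1f} mA")
```

Because the current depends exponentially on the forward voltage, even a small spread in device parameters shows up as a visible brightness mismatch, which is why each parallel LED usually gets its own resistor.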
So the drain/source current is proportional to the gate voltage?
I think so, roughly. (It's been a while since I studied MOSFETs; strictly, in the "saturation" region the drain current goes as the square of the gate voltage above the threshold, not linearly.) But we are often driving the MOSFET fully on, into its "triode" (ohmic) region, where the current is limited by a resistor.
We are using it as a switch, where it's either full-on or full-off. When it's full-on, the current might "calculate" as 10 Amps, but your resistor (or other circuitry) might limit the actual current to 1A or so...
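A quick sketch of why the resistor, not the MOSFET, sets the current when the switch is fully on. The Rds(on), supply, and load values below are example assumptions:

```python
# Sketch: a fully-on MOSFET acts like a small resistance Rds(on) in
# series with the load, so the load resistor dominates the current.
# All values are illustrative assumptions.

def switch_current(supply_v, r_load, rds_on):
    """Ohm's Law over the whole loop: I = V / (Rload + Rds(on))."""
    return supply_v / (r_load + rds_on)

# 12V supply, 12-ohm load, MOSFET with 0.05-ohm Rds(on):
# the answer is very close to 12/12 = 1 A; the MOSFET barely matters.
i = switch_current(12.0, 12.0, 0.05)
print(f"{i:.3f} A")
```

The MOSFET's theoretical "10 Amp" capability never comes into play; the external circuit limits the current to about 1A, as described above.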