Then, I used an LED resistor calculator (the parallel calculator, since I assume the LEDs in my display are wired in parallel)... which tells me that I need a 1 Ohm resistor (i.e. effectively no resistor at all) when the input voltage equals the forward voltage of the LED, no matter what value I enter as the "desired LED current".
since it will drop the voltage even further, so the LEDs won't be at full brightness?
I will do what dhenry said in his last post, and use PWM's max value instead of trying to limit it.
the voltage across an LED also depends on its current, but not linearly as with a resistor.
I've used two 1N4007 diodes in series instead of the resistor, and it works great.
Once again: if you try to design a circuit to drive an LED with some specific voltage, you are doomed to failure.
The current consumption was 86, 105, and 118 mA respectively.
Contrary to conventional "wisdom", high power LEDs behave far more like a resistor than a diode.
Is that what we are talking about here?
I would think that any LED backlight that is being driven by an output pin of the Arduino would be classified as low power, and it would behave much like any diode, not like a resistor.
For several years, other contributors such as Grumpy Mike and Liudr and I have been giving advice on why this reasoning is incorrect and why LEDs always need current limiting, usually in the form of a series current-limiting resistor, when driven from an output pin.
Keep in mind that the target audience for the Arduino is not someone with an Engineering background...
The bottom line: you can mess around with PWM, diodes, or some combination of the two, but you should still use a series current-limiting resistor with your LED.