Okay but here: http://arduino.cc/en/Tutorial/PWM
I can read:
This on-off pattern can simulate voltages in between full on (5 Volts) and off (0 Volts) [...] The result is as if the signal is a steady voltage between 0 and 5v controlling the brightness of the LED.
So I supposed that limiting the PWM as in my code would output a steady 3.3V. :zipper_mouth_face: Then I used an LED resistor calculator (the parallel calculator, since I suppose the LEDs of my display are mounted in parallel)... which tells me that I need a resistor of 1 Ohm (AKA: I don't even need a resistor) when the input voltage is the same as the forward voltage of the LED, and it doesn't matter how much current I enter as the "desired LED current".
Guys, I'm lost...
Why do they always fight "the resistor"? Why resist? Is it "the expense"?
I don't; in fact, I planned to add a resistor since the first post. I'm just wondering why I need one when a calculator tells me otherwise.
Put simply, I don't see the difference between these two:
PWM 66% ----------> LEDs
PWM 100% ---[R]---> LEDs