LED Fading

I have been playing around with light-dependent resistors (LDRs), using one to fade an LED.

I am confident my sketch works correctly; however, I don't see much dimming of the LED. For instance, when I set the LED with analogWrite to, say, 1, the brightness is very low, but anything between 10 and 255 just seems to be the same brightness. I can't seem to get it to vary all that much. Is this a characteristic of LEDs, or is it something I am doing wrong?

I know the code works, as I can do a simple switch on and off of the LED using the LDR with no problem, which I guess proves my wiring.

Any help is greatly appreciated.

It would help if you posted your code. Also check that you are using one of the PWM pins. The LED needs to be on pin 3, 5, 6, 9, 10, or 11, otherwise it won't fade; it will just be either on or off.

It could also be that you are pushing too much current through the LED. What value of series resistor are you using?

My bet would be that you are expecting your LDR to output 0 to 1023 via the analog input, but it's (for example) going from 640 to 890. Output the values to the serial monitor to find the maximum and minimum, then map those values to 0-255 for the LED output.

LED brightness is normally controlled by PWM techniques, just like a motor: you should not try to vary the voltage, but rather vary the on-time. It works quite well on an Arduino. I normally use "state machine" code and a digital output pin.