Thanks for all your contributions, everyone, but maybe it's better if I explain my issue properly.
I started with a simple voltage divider: LDR to +5V, a 10 kΩ resistor to GND, and A0 connected in between. This actually works pretty well, I get "stable" readings with enough accuracy for my purpose.
My purpose is to control some lights, turning them on when it's "dark enough" and back off when day's dawning. In Italian this is called a "twilight switch", nice name :).
This means that I don't need to distinguish between "bright daylight" and "almost dark", those could both read full scale for me. But I need to distinguish between "slightly dark", "dark", "very dark", "darker", "darkest", etc. Currently I am not able to do this as my 0-1023 readings are more or less distributed like this:
- ~150-1023: Daylight
- 0-149: Dark
I need far more resolution in the darkness, sort of the opposite of the situation above: 100-150 steps for daylight, and the rest of the scale to differentiate among varying degrees of darkness. This is why I was trying to "amplify" the current with the transistor: I hoped that with the lower impedance at the transistor output the readings would be better distributed. I also tried different resistors, but things don't change much.
The capacitor may smooth the readings, but since I am already averaging them in software (I take 10 readings a minute and average them), I might not need it.