Analog Sensor

I have a sensor with a 0-1V analog output — it's an ignition advance output from an ECU, where the voltage represents the advance (e.g. 0V = 0 degrees, 1V = 100 degrees). I'm trying to simulate this with a voltage divider (to take 5V down to 1V) and a pot to give me the ability to change the voltage between 0 and 1V.
I built the divider with a 100-ohm and a 22-ohm resistor (I didn't have a 25-ohm, so my range is really 0 to 0.9V). Since I need pretty fine granularity, I want 1V to read as 1023 on the analog input. Do I have to change the reference voltage to 1V instead of the default 5V? Right now, 0.9V reads approximately 180 (it varies between 180 and 185, which I'm taking as the tolerance of the resistors and the pot).

analogReference(INTERNAL); will switch the analog inputs to use the internal ~1.1V bandgap reference (on ATmega168/328-based boards), which will give you more resolution.