 # Getting a 1V voltage source

Hi,

I am currently designing a circuit for a sensor, and I wanted to know if anyone has any ideas about how I could get a 1V source on my circuit. I will use this voltage across my device, and I need it to be very stable (as the device works in the nano amp range).

Some ideas I thought of were:

Using a battery — I could do this but I want only a single battery for my circuit (it powers up the Arduino)

Using a zener diode — I could do this, but I want to keep the voltage as close to 1V as possible, and I can’t find any commercially available regulators or references that are accurate at 1V

Using a voltage divider — The problem would be power dissipation, and the input impedance of my sensor is only ~100 kOhm - 1 MegaOhm, so having it in parallel with the resistors could affect the actual voltage across the device

narin101: Using a voltage divider — The problem would be power dissipation, and the input impedance of my sensor is only ~100 kOhm - 1 MegaOhm, so having it in parallel with the resistors could affect the actual voltage across the device

That doesn’t make sense. With very high impedance load the power dissipation in the potential divider is chuff all.

Why not tell us what sensor you have instead of making us guess?

The sensor is made from mono-layer MoS2 and I'm using it to detect certain proteins. The issue is that the devices I have aren't very good, so the current is initially around 100 uA, but over a period of four hours it eventually stabilizes to around 10 nA.

I was thinking that a voltage divider would be my best bet, but I'm not sure how much current the device will take initially. I would rather avoid frying a device, but I guess I could sacrifice one just to see what happens.

@Riva, thanks! I think that would be perfect.

That ADR510 looks like a good option. :)

May I ask what the voltage tolerance (%) of your sensor is?

In the good old days we'd use a 'standard cell' giving 1.018638 volts, see https://en.wikipedia.org/wiki/Weston_cell

A voltage divider will rely on the voltage supplying it, so you are left with only options 1 and 2. How much will the battery's voltage decrease over its lifetime? Is the decrease a function of time that you can calculate? If you know these two answers, then using the battery with a voltage divider may be a solution.
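To make the point concrete, here is a quick sketch (plain Python, purely illustrative; the resistor values and the 10% battery-droop figure are assumptions, not from the thread) showing that a divider's output droops in direct proportion to its supply:

```python
# Unloaded divider: Vout = Vin * R_bottom / (R_top + R_bottom)
def divider_out(v_in, r_top, r_bottom):
    return v_in * r_bottom / (r_top + r_bottom)

R_TOP, R_BOTTOM = 160.0, 40.0  # assumed values: 1 V from a 5 V supply

v_fresh = divider_out(5.0, R_TOP, R_BOTTOM)  # fresh battery at 5.0 V
v_worn = divider_out(4.5, R_TOP, R_BOTTOM)   # battery sagged 10% to 4.5 V

print(v_fresh)  # 1.0 V
print(v_worn)   # 0.9 V -- the output sags by the same 10%
```

So unless the battery's discharge curve is known (and compensated for), the divider output tracks every millivolt of supply change.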

Since you are using an Arduino: do I not recall that if you select the internal 1.1 V analog reference for the ADC, it will in fact output this voltage on the AREF pin?

Some processors do, some don't. If the datasheet calls for a 0.1 μF bypass capacitor on AREF, then it does.

However, this...

narin101: ...and I need it to be very stable (as the device works in the nano amp range).

...may preclude the use of the internal reference. It does vary a bit by temperature and a tiny bit by Vcc.

narin101: Using a voltage divider — The problem would be power dissipation, and the input impedance of my sensor is only ~100 kOhm - 1 MegaOhm, so having it in parallel with the resistors could affect the actual voltage across the device

Have you done some sums on this?

Use a divider with a total of say 200R, so on 5V that will draw 25mA with a power dissipation of 0.125W, not much more than an LED.

Let's say your values are 40R and 160R, giving you 1V across the 40R. With no load the voltage is 1V. With a 100K load it is 0.99968V, and with a 1M load it is 0.999968V.

That is a change in voltage of 0.999968 - 0.99968 = 0.000288 or 0.288mV - The noise on your voltage supply is likely to be way bigger than this.
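The arithmetic above can be checked with a short sketch (plain Python, just for illustration; resistor and load values are the ones quoted in the post):

```python
# Loaded divider: the sensor's input impedance appears in
# parallel with the bottom (40R) resistor.
def loaded_divider(v_in, r_top, r_bottom, r_load):
    r_par = r_bottom * r_load / (r_bottom + r_load)  # r_bottom || r_load
    return v_in * r_par / (r_top + r_par)

v_100k = loaded_divider(5.0, 160.0, 40.0, 100e3)  # 100K sensor impedance
v_1m = loaded_divider(5.0, 160.0, 40.0, 1e6)      # 1M sensor impedance

print(round(v_100k, 6))  # ~0.99968 V
print(round(v_1m, 6))    # ~0.999968 V
print(round((v_1m - v_100k) * 1e3, 3), "mV shift")  # ~0.288 mV
```

A sub-millivolt shift even as the sensor impedance swings by a factor of ten, which supports the point that supply noise, not divider loading, dominates the error.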

That is way less than the variation you can expect to see using a 1V regulator.