The electric fence consumes between 170 and 260 mA at 12 V, depending on whether vegetation touches the fence.
I estimate the extra load of an ATtiny1624 at 5 mA. With an LDO it will still be 5 mA at 12 V; with a switching regulator it should be less. Once I have built my prototype I will measure the consumption and decide whether it's worth adding a sleep function.
I like your idea of driving the reference from an IO pin, but I would connect the 2.5 V ref to the AREF pin and be done with just one conversion.
The ADC input resistance of the ATtiny1624 is also 10K (datasheet chapter 33.16).
I am OK with programming everything by setting up registers directly, though that usually takes me a lot of time. I find the modern ATtiny registers quite a bit more complex than the old ones.
Luckily @DrAzzy put a lot of effort into supporting and documenting the ADC for megaTinyCore, so I am studying that as we speak.
I will ask the ADC for a 17-bit result with the analogReadEnh() function. That does 1024 samples in one go, and I will drop the least significant bit to end up with 16 bits, at the risk that the last bits are just noise. I would already be very happy with a stable 12 bits of resolution.
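As I understand megaTinyCore's decimation, each extra bit beyond the native 12 costs a factor of 4 in samples, roughly:

```cpp
// Samples accumulated per analogReadEnh() call on a 12-bit ADC, assuming standard
// 4^n decimation: 16 bits -> 256 samples, 17 bits -> 1024 samples.
uint16_t samplesFor(uint8_t resolutionBits) {
  const uint8_t nativeBits = 12;
  return (uint16_t)1 << (2 * (resolutionBits - nativeBits));   // 4^(extra bits)
}
```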
You need to interpret that in the light of 30.3.3.5.2; it's NOT the input resistance, it's an internal resistance that doesn't affect the voltage applied at the input. The ACTUAL input resistance (i.e. Vin/Iin) is VERY high. I've shown that a 10M resistor doesn't change the measurement significantly (i.e. within the limits of precision of the converter).
A high input resistance in the external circuit will increase the settling time, but since you are measuring from the same input terminal each time that isn't relevant.
"After switching input or reference, the ADC requires time to settle."
Yes, I don't see why you can't apply the reference voltage directly to VREFA.
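With megaTinyCore that should just be the EXTERNAL reference option, if I read the core docs right (verify against your core version):

```cpp
// Assumption: megaTinyCore exposes the VREFA pin as the EXTERNAL reference on 2-series parts.
void setup() {
  analogReference(EXTERNAL);   // use the TL431's 2.5 V on the VREFA pin
}

void loop() {}
```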
I'd add a temperature sensor too. Take a measurement at room temperature then one after the setup has been cold acclimated and use map() or generate a function to compensate. Take the guesswork out of it.
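Something along these lines, with completely made-up numbers for the two calibration points:

```cpp
// Hypothetical two-point temperature compensation using map().
// calAtWarm/calAtCold are divider readings taken at two known temperatures.
const long tempWarm = 22, tempCold = -10;         // calibration temperatures in degC (assumed)
const long calAtWarm = 32768, calAtCold = 32550;  // raw 16-bit readings at those temperatures (made up)

long compensate(long reading, long tempNow) {
  // Offset expected at the current temperature, interpolated between the two calibration points.
  long expectedOffset = map(tempNow, tempCold, tempWarm, calAtCold - calAtWarm, 0);
  return reading - expectedOffset;
}
```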
@johnerrington That's a good one. I thought I had read something about the MΩ impedance somewhere, but could not find it in the datasheet.
I was thinking about a 20K-10K divider; that would take 0.4 mA from the battery. I can double those values to get to 0.2 mA.
Plenty of time to add a delay() before an ADC conversion to get the reference and ADC settled upfront.
@apf1979 The ATtiny1624 has a temperature sensor onboard, even with calibration data in the SIGROW (datasheet chapter 30.3.3.7). I have never worked with it.
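From what I can make of that datasheet section, the readout would go something like this (untested; the SIGROW register names are from the datasheet and ADC_TEMPERATURE / analogReadResolution() from megaTinyCore, so verify before trusting it):

```cpp
// Reading the on-chip temperature sensor on a tinyAVR 2-series, following the datasheet recipe.
uint16_t readTemperatureK() {
  analogReference(INTERNAL1V024);                  // temp sensor is specified against the 1.024 V reference
  analogReadResolution(12);                        // factory calibration assumes a 12-bit reading
  uint16_t adcReading = analogRead(ADC_TEMPERATURE);

  uint16_t sigrowOffset = SIGROW.TEMPSENSE1;       // factory calibration: offset
  uint16_t sigrowSlope  = SIGROW.TEMPSENSE0;       // factory calibration: slope

  uint32_t temp = sigrowOffset - adcReading;
  temp *= sigrowSlope;                             // scale (overflows 16 bits, hence uint32_t)
  temp += 0x0800;                                  // rounding before the divide below
  temp >>= 12;                                     // result in Kelvin
  return (uint16_t)temp;
}

void setup() { Serial.begin(115200); }
void loop()  { Serial.println(readTemperatureK()); delay(1000); }
```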
That is about the time it takes for the external circuit to charge the sample/hold capacitor.
But there are also leakage currents that may affect the voltage at the input if the resistance of the external circuit is high.
True. All I can say is that I tested this on a Nano and the 10M did not change the ADC reading by more than 0.5 LSB (using oversampling).
I don't expect much difference on other MCUs, but of course some may have higher native-resolution ADCs.
I've repeated the test with a 300M resistor and 16384× oversampling, then taken the raw summed ADC value and divided it by 16384. The "error" is less than 1 ADC digit.
Results:
A0 (R = 5k6): 740.41
A1 (R = 300M): 740.24
Since 1 LSB = 5 V / 1024 ≈ 4.9 mV, the leakage current works out to only a few pA.
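For anyone who wants to repeat it, the test boils down to this (pin assignment and formatting are just an example, not the original sketch):

```cpp
// Oversampling test on a classic AVR (e.g. a Nano): sum 16384 readings per channel,
// divide by 16384, and compare the low- and high-impedance channels.
const uint16_t N = 16384;

float oversample(uint8_t pin) {
  uint32_t sum = 0;
  for (uint16_t i = 0; i < N; i++) {
    sum += analogRead(pin);
  }
  return (float)sum / N;                 // fractional ADC counts
}

void setup() {
  Serial.begin(115200);
  Serial.println(oversample(A0), 2);     // fed through 5k6
  Serial.println(oversample(A1), 2);     // fed through 300M
}

void loop() {}
```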
The input resistance of the pins is somewhere in the 1-10 GΩ order of magnitude. Even a 10M source resistance is two orders of magnitude smaller, so it will have little to no effect on this.
Not that such a high source resistance is a good idea, though.
The datasheet gives a typical leakage current of <0.05 μA. A 10 kΩ source resistance then gives a 0.5 mV deviation. Worst-case scenario: at 17-bit precision with the internal 1.024 V reference, 1 LSB = 7.8 μV, so that deviation is 64 LSB.
To put this in perspective: the datasheet gives a total unadjusted error (internal reference 4.096 V, 12-bit resolution) of 30 LSB max. (Which is better than 1%.)
Of course you are allowed to calculate your own case, to obtain the precision you want/need.
The objective is a warning when the battery voltage gets low. Tentative guess: 11.5V, give or take 0.1V.
You might use a divider 27k/10k and internal reference 4.096 V.
The alternative is not much better:
Choose a 56k/10k divider and an external reference. Total unadjusted error 10 LSB ≈ 0.3%.
Add the accuracy of the TL431 (0.5%) and of the resistors (0.1%), giving a total of <1%.
Total required components: 3 resistors, a small capacitor for noise suppression and a TL431.
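To turn a reading back into battery volts you only need the divider ratio; a quick sketch for the 27k/10k option with the 4.096 V reference (the function and constant names are just for illustration):

```cpp
// Converting a 12-bit ADC reading back to battery voltage for a 27k/10k divider
// measured against the internal 4.096 V reference.
const float VREF   = 4.096;      // internal reference
const float R_TOP  = 27000.0;    // divider top resistor
const float R_BOT  = 10000.0;    // divider bottom resistor
const float COUNTS = 4096.0;     // 12-bit ADC

float batteryVolts(uint16_t adc) {
  float vPin = (adc / COUNTS) * VREF;        // voltage at the ADC pin
  return vPin * (R_TOP + R_BOT) / R_BOT;     // scale back up by the divider ratio
}
// Example: 11.5 V at the battery gives 11.5 * 10/37 ≈ 3.11 V at the pin,
// i.e. roughly 3.11 / 4.096 * 4096 ≈ 3108 counts.
```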
I have just finished rigging up my prototype.
I chose a 47K - 10K voltage divider (lower right corner of the picture). That gives me less than 2.5 V, so I can use either the TL431 (top right corner) or the internal 2.5 V VREF with the same divider. I am curious how they compare; probably no difference.
I took a TL750L as the 5 V regulator, because it has the lowest quiescent current (1 mA) of the LDOs I have.
Top center is the small MOSFET that will turn on the power to the GSM/GPRS shield, which will connect via the bunch of pins in the top left corner.
The 4 pins to the left of that are an I2C breakout, so that I can hook up an LCD for debugging and other info.
The blue connector is for connecting the 12V battery.
Now I need to start writing software.
And if everything works, I may need to redesign everything onto a compact SMD board, including a smaller GSM/GPRS module, and cast it in epoxy or Plasti Dip. It will have to survive outside in a field box; this prototype and GSM/GPRS shield won't last long there.
Oh, and I use an ATtiny1626, so I can save my smaller ones for when size is important.
I am pretty happy with the results of the voltage measuring.
With my test sketch I measure the voltage at my voltage divider with a resolution of 0.1 mV. I let it sample 10,000 times and then print the min, the max and the difference between them.
Each analogReadEnh() takes 1024 samples and provides a 17-bit result, of which I discard the LSB to get a 16-bit number. So in fact, for this test, it did 10,240,000 ADC conversions.
With the internal 2.5 V reference I typically get around 1.6 mV min-max difference, and with the external TL431 about 0.8 mV, both at a stable temperature.
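In outline the sketch does something like this (simplified; the pin name and reference constant are placeholders to check against megaTinyCore, not the exact code):

```cpp
// Min/max test: 10,000 oversampled readings from the divider tap,
// each one a 17-bit analogReadEnh() result with the LSB dropped.
const uint8_t  DIVIDER_PIN = PIN_PA1;    // divider tap (assumption)
const uint32_t N_READINGS  = 10000;
const float    VREF_MV     = 2500.0;     // internal 2.5 V reference, in millivolts

void setup() {
  Serial.begin(115200);
  analogReference(INTERNAL2V500);        // or EXTERNAL for the TL431 on VREFA

  uint16_t minVal = 0xFFFF, maxVal = 0;
  for (uint32_t i = 0; i < N_READINGS; i++) {
    uint16_t v = (uint16_t)(analogReadEnh(DIVIDER_PIN, 17) >> 1);  // 16-bit value
    if (v < minVal) minVal = v;
    if (v > maxVal) maxVal = v;
  }

  // Convert to millivolts at the divider tap: full scale (65536) equals VREF.
  Serial.print("min:  "); Serial.println(minVal * VREF_MV / 65536.0, 1);
  Serial.print("max:  "); Serial.println(maxVal * VREF_MV / 65536.0, 1);
  Serial.print("diff: "); Serial.println((maxVal - minVal) * VREF_MV / 65536.0, 1);
}

void loop() {}
```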