I have put together a project to monitor the temperature of our fridge and freezer. The Fritzing schematics (sorry) are attached to this message.
It calculates the average temperature, trends and a whole lot of other things. If the temperature goes out of acceptable limits, it sounds an alarm and repeatedly blinks the LCD backlight. One button silences the alarm via an interrupt, and the other manually turns on the LCD backlight should one want to see the screen in the dark.
In other words, it is an over-engineered thermometer.
The problem I am having is that my temperature sensors are analog, so they need a stable supply voltage to return accurate, calibrated readings. That is not the case, however; the voltage varies a lot:
When I power the whole thing up with a 5V 2A AC adapter, voltage on the 5V pin and the Vin pin is 4.55ish.
When the screen is blinking/the backlight is on, it drops to around 4.30V
If I plug in both the AC and the USB, the 5V and Vin pins are around 4.67V
It drops to around 4.40V when there is backlight activity.
I checked the output of the AC at the plug and it's 5.39V
Where does my problem stem from? Is my whole “system” drawing more than 2A? That seems like a rather huge value, doesn't it?
How can I make the 5V pin actually deliver 5V, so that I don't have to re-tweak the code every time I change the way it's powered?
Once these problems are ironed out, I plan on moving everything to a PCB powered by a standalone ATmega328 without a USB-to-serial interface. I was thinking I could power it with AA batteries. Is that feasible, or does it use too much juice?
I'm not very experienced with all things electric/electronic, so please be patient and detailed in your answers.
Thanks for your help!
--edit: attached a more readable image size than what the forum offers.