Hey everyone, I have a question I'd like some clarity on, and ideally a better understanding of what's going on.
I have an experiment that involves powering a device that normally runs from a 3 V lithium coin cell (a car key fob).
I already have 5 V in the circuit, which I'm using to power an Arduino Pro Micro; it's supplied by a three-legged 3 A voltage regulator.
I measured a brand-new lithium coin cell at about 3.3 V, and in my experience they read under 2 V by the time you finally replace them, so I don't think these devices are too sensitive to the voltage being exactly 3 V.
I don't really want to use another regulator, mainly because I don't have one.
I'm considering a couple of resistors as a voltage divider, since a key fob must draw only a TINY amount of current, given that it can run off a coin cell for years.
I calculated that a 1k and a 2.2k resistor put me close enough to mimic a coin cell (5 V × 2.2k / 3.2k ≈ 3.44 V), but I'm wondering: what are the repercussions of doing this?
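Here's the math I ran, as a quick sketch. The droop figure assumes the divider behaves as a simple Thevenin source, and I haven't measured the fob's actual transmit current, so that part is a guess on my end:

```cpp
// A quick sanity-check of my divider math (it runs on the Pro Micro,
// but it's really just arithmetic).
const float VIN   = 5.0;     // regulator output
const float R_TOP = 1000.0;  // 1k from 5 V to the output node
const float R_BOT = 2200.0;  // 2.2k from the output node to ground

void setup() {
  Serial.begin(9600);
  while (!Serial) {}  // Pro Micro: wait for the USB serial port

  // Unloaded divider output: Vout = Vin * Rbot / (Rtop + Rbot)
  float vOut = VIN * R_BOT / (R_TOP + R_BOT);
  Serial.print("Unloaded Vout (V): ");
  Serial.println(vOut);          // ~3.44 V, close to a fresh cell

  // Source (Thevenin) resistance the fob would see: Rtop || Rbot
  float rSrc = (R_TOP * R_BOT) / (R_TOP + R_BOT);
  Serial.print("Source resistance (ohms): ");
  Serial.println(rSrc);          // ~687 ohms

  // Each mA the fob draws should pull Vout down by about this much,
  // which is the part I can't judge without knowing the fob's draw.
  Serial.print("Droop per mA of load (V): ");
  Serial.println(rSrc * 0.001);  // ~0.69 V per mA
}

void loop() {}
```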
The only other thing I can think of with the parts I have at hand is a couple of silicon diodes in series, each dropping roughly 0.7 V, to take the 5 V DC down to about 3.6 V.
What do you guys think? The voltage divider route is way cheaper than diodes; the resistors I have are 1/4 W.
I'd love to hear your thoughts, as I've never tried powering anything from a voltage divider before. It's such a tiny load, and when my circuit presses a button the fob will only transmit for a fraction of a second, so I'm thinking there wouldn't be enough time for the resistors to come close to warming up; I've tried to put numbers on that below.
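To put numbers on the heating question, here's the steady-state dissipation as I understand it (assuming the fob itself draws next to nothing at idle, which I haven't verified):

```cpp
// Steady-state dissipation in the divider itself, ignoring the fob
// (assumption on my part: the fob draws next to nothing at idle).
const float VIN   = 5.0;
const float R_TOP = 1000.0;
const float R_BOT = 2200.0;

void setup() {
  Serial.begin(9600);
  while (!Serial) {}

  float i = VIN / (R_TOP + R_BOT);   // ~1.56 mA flows continuously
  Serial.print("Divider current (mA): ");
  Serial.println(i * 1000.0, 3);

  // P = I^2 * R for each resistor; both far below the 1/4 W rating
  Serial.print("1k dissipation (mW):   ");
  Serial.println(i * i * R_TOP * 1000.0, 2);  // ~2.4 mW
  Serial.print("2.2k dissipation (mW): ");
  Serial.println(i * i * R_BOT * 1000.0, 2);  // ~5.4 mW
}

void loop() {}
```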
The other thing I'm curious about with voltage dividers: does it matter which position the different-value resistors are in? Higher value closer to ground, or the other way around? I googled this and couldn't find a clear answer.
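For reference, here's the formula I've been working from. The way I read it, the output is taken across the bottom resistor, so swapping the positions changes the voltage, but please correct me if I've got that backwards:

```cpp
// The divider formula as I understand it: Vout is taken across the
// resistor between the output node and ground, so resistor position
// should matter. Please correct me if I have this backwards.
float divider(float vin, float rTop, float rBot) {
  return vin * rBot / (rTop + rBot);
}

void setup() {
  Serial.begin(9600);
  while (!Serial) {}
  Serial.println(divider(5.0, 1000.0, 2200.0));  // 2.2k on bottom: ~3.44 V
  Serial.println(divider(5.0, 2200.0, 1000.0));  // 1k on bottom:  ~1.56 V
}

void loop() {}
```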
Thanks in advance, Michael