Hopefully I'm posting this in the correct area; I figured that, since this is the "power" section, it may be the best place for this. Apologies in advance if this is a rookie question, as I'm still new to the hardware side of things and my background is mainly in software.
First of all, my understanding is this: a battery has a voltage (e.g. 1.5V) and a "capacity" (e.g. 2000mAh). When you connect it in a circuit, the battery will output 1.5V, but the current is not necessarily fixed at 2A. The current can pretty much be anything, as it's determined by the voltage and the resistance in the circuit (so it could be 500mA or 4A depending on what's in the circuit).
Assuming the 1.5V is adequate for the circuit components: if the current in the circuit is 500mA, the battery will last 4 hours. If it's 2A it lasts 1 hour, at 4A it lasts 30 minutes, and at 20A, 6 minutes.
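To show my reasoning, here's the arithmetic I'm imagining in a quick Python sketch. This assumes capacity simply divides by the draw, ignoring any real-world effects (which may be exactly where I'm going wrong):

```python
def runtime_hours(capacity_mah, draw_ma):
    """Ideal runtime in hours: capacity (mAh) divided by draw (mA)."""
    return capacity_mah / draw_ma

# My four examples, for a 2000mAh battery:
for draw_ma in (500, 2000, 4000, 20000):
    minutes = runtime_hours(2000, draw_ma) * 60
    print(f"{draw_ma} mA -> {minutes:.0f} minutes")
# 500 mA -> 240 minutes, 2000 mA -> 60, 4000 mA -> 30, 20000 mA -> 6
```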
First of all, can you confirm that understanding is correct or have I got this absolutely wrong?
As you use a battery, does the voltage remain constant while the capacity drops as it is used, or will the voltage also drop over time?
For instance, if I have a 1.5V battery rated at 2,000mAh and the circuit draws 1A for 30 minutes, does that mean the battery will then be at 0.75V, or will the voltage stay the same? Or does the current that can be drawn from the battery start to drop instead?
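In case it helps to see what I mean, here's my mental model in code: the capacity counts down linearly with the charge drawn, and the voltage stays put. I suspect this model is too simple, which is really what I'm asking about:

```python
def remaining_mah(capacity_mah, draw_ma, hours):
    """My (possibly wrong) model: remaining capacity after a constant draw."""
    used_mah = draw_ma * hours
    return capacity_mah - used_mah

# 1A (1000mA) drawn for half an hour from a 2000mAh battery:
print(remaining_mah(2000, 1000, 0.5))  # 1500 mAh left in this model
```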
Thanks in advance for looking at my newbie post!