Hello, I am once again trying to wrap my head around inverters.
Let's say I have an appliance that uses 1000 watts and its maximum current is 10 amps.
I also have a 2000w inverter and a 12v battery.
Is the math correct to say 1000 W / 12 V = 83.3 A?
My question is: is it the battery that must be able to supply the 83.3 amps to the inverter, while it's the inverter's job to put out a regulated 110 V at the 10 amps my appliance requires? In other words, it's not my appliance that needs to handle the 83 amps, but my battery feeding the inverter?
I'm pretty sure this is correct but I just wanted to make sure.
P.S. I realize I didn't include the battery type or amp-hours; I'm just after a broad understanding.
Thanks
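For what it's worth, the power-balance reasoning above can be checked with a quick script (numbers from the post; 110 V AC output assumed):

```python
# Power is (roughly) conserved through the inverter, so the same
# 1000 W shows up as a different current on each side.
load_w = 1000.0
battery_v = 12.0
ac_v = 110.0

battery_a = load_w / battery_v   # current the battery must supply
ac_a = load_w / ac_v             # current the appliance actually draws

print(f"Battery side: {battery_a:.1f} A")  # ~83.3 A
print(f"AC side:      {ac_a:.1f} A")       # ~9.1 A
```

So yes: the big current lives on the 12 V side, between the battery and the inverter, and the appliance only ever sees its normal 110 V current.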
It depends on the load on the inverter; these are rarely run at full power. At a load of 1000 watts, the battery current will be 1000 W / 12 V ≈ 83 A. With a 400 Ah battery, 400 Ah / 83 A ≈ 4.8 hours. Multiply by 0.8 to allow for efficiency and the runtime is about 3.8 hours. In practice it will be less than that, because the battery voltage drops as it discharges, causing more than 83 A to flow for the same power.
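That runtime estimate works out like this (400 Ah capacity and the 0.8 efficiency allowance are the figures assumed above):

```python
capacity_ah = 400.0              # assumed battery capacity
battery_a = 1000.0 / 12.0        # ~83.3 A draw at the full 1000 W load
efficiency = 0.8                 # rough allowance for inverter losses

ideal_hours = capacity_ah / battery_a       # capacity / current
realistic_hours = ideal_hours * efficiency  # derated for losses
print(f"{ideal_hours:.1f} h ideal, {realistic_hours:.1f} h with losses")
# 4.8 h ideal, 3.8 h with losses
```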
High-power inverters are a lot more efficient than this, to keep costs down. Heat dissipation costs
real money, and efficiency is easier to achieve at scale; typical 2 kW inverters are more like 95% efficient and up.