I'm a curious hobbyist/noob with a little basic knowledge of electronics, and I have a question about resistors in parallel.
I built a small 5 V DC circuit with a 39-ohm dummy load to fool the power bank's auto-shutdown current limit into staying on, so I can run low-current-draw applications with it.
5 V / 39 Ω ≈ 128 mA through the dummy load. This switches on via an ATtiny85 and an NPN transistor for 1 second every 110 seconds. I added a small 2.2 V / 20 mA LED to blink along with it, so I can see it's working as long as the LED blinks.
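For reference, here is the quick Ohm's-law check I did for those numbers (just a sketch, assuming an ideal 5 V supply and the LED's nominal 2.2 V / 20 mA ratings):

```python
# Quick Ohm's-law check of the values above.
V_SUPPLY = 5.0   # power bank output, volts
R_DUMMY = 39.0   # dummy-load resistor, ohms
V_LED = 2.2      # LED forward voltage, volts (nominal)
I_LED = 0.020    # target LED current, amps

# Current through the dummy load alone: I = V / R
i_dummy = V_SUPPLY / R_DUMMY
print(f"dummy load: {i_dummy * 1000:.0f} mA")  # ~128 mA

# Series resistor needed to drop the remaining voltage at 20 mA
r_series = (V_SUPPLY - V_LED) / I_LED
print(f"LED resistor: {r_series:.0f} ohms")    # 140 ohms; 150 is the next standard value up
```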
Now, I connected the LED from VCC through a 150-ohm resistor to the anode, to give the LED the proper current at 5 volts. The 39-ohm resistor sits in parallel, from VCC to the LED's cathode, and that node goes through the NPN to common ground. I've attached a crude diagram of the circuit to give a better picture of the whole; it concerns R3, R4 and D4. The circuit works in real life on a breadboard, and the power bank has now been on for over 14 days. So far so good: I got it working without any flaws, yet.
I built a simulation of this in TinkerCad, and in it the dummy load shows a current draw of 133 mA. Pretty close to the theory.
But what I do not understand is this: when I take the LED and its resistor out of the circuit, the load doesn't change; it stays the same. At least it does in TinkerCad. Is this a TinkerCad flaw? Or is it a matter of chance that they cancel each other out perfectly?
From theory, I understand that the current draw should change when parallel resistors are added. I can find no information about how this works when an LED is in the circuit as well. Does the current drawn through the LED cancel out that 150-ohm resistor, so that the 39-ohm dummy load simply stays the same in this case?
I would like to understand how this really works, and how to calculate something like this.
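This is my own naive attempt at the calculation (a sketch only; it assumes the NPN acts as an ideal closed switch and models the LED as a fixed 2.2 V drop). It predicts the total current *should* change when the LED branch is removed, which is exactly what confuses me about the simulation:

```python
# Naive two-branch parallel calculation.
# Assumptions: NPN treated as an ideal closed switch (0 V drop),
# LED modeled as a constant 2.2 V forward drop.
V = 5.0          # supply voltage
R_DUMMY = 39.0   # dummy-load branch resistance, ohms
R_LED = 150.0    # LED series resistor, ohms
V_F = 2.2        # LED forward voltage, volts

i_dummy = V / R_DUMMY          # dummy branch: ~128 mA
i_led = (V - V_F) / R_LED      # LED branch: ~19 mA

# Parallel branches see the same voltage, so branch currents simply add:
total_with_led = i_dummy + i_led   # ~147 mA
total_without_led = i_dummy        # ~128 mA

print(f"with LED branch:    {total_with_led * 1000:.1f} mA")
print(f"without LED branch: {total_without_led * 1000:.1f} mA")
```

By this reckoning, removing the LED branch should drop the total by roughly 19 mA, yet TinkerCad shows the same reading either way.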
Oh, I might add: the NPN switching the 3 x 12-hour LEDs draws only 50 mA.