Hello, I have two devices, one rated for 12 V 1.7 A and the other for 5 V 2 A (connected to power through a 12 V to 5 V converter). My power source is 12 V 3 A. If one device draws 20 W, will it destroy the other device, which is only rated for 10 W, since they are both connected to the same source?
In other words, it is a power supply with an output capacity of 36 W (12 V × 3 A).
First, Device A draws 12 V × 1.7 A = 20.4 W.
The power supply still has 15.6 W of spare capacity.
Second, Device B draws 5 V × 2 A = 10 W.
But Device B is connected through a DC-DC converter, and a converter does not convert voltage with 100% efficiency.
Let's use about 80% efficiency for this calculation.
5 V × 2 A = 10 W
10 W ÷ 0.8 = 12.5 W
12.5 W ÷ 12 V ≈ 1.04 A
The converter draws about 1.04 A from the 12 V supply (12.5 W, because of the 80% efficiency) in order to output 5 V 2 A (10 W).
Now the total output of the supply is 20.4 W + 12.5 W = 32.9 W.
That is about 2.74 A at 12 V.
Doesn't that fit within the power capacity you described?
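The whole budget check above can be sketched in a few lines of Python (the 80% converter efficiency is the same assumption used in the calculation):

```python
# Power budget check for a 12 V, 3 A supply.
SUPPLY_V = 12.0
SUPPLY_A = 3.0
capacity_w = SUPPLY_V * SUPPLY_A  # 36 W total capacity

# Device A: connected directly to the 12 V rail.
device_a_w = 12.0 * 1.7  # 20.4 W

# Device B: 5 V, 2 A behind a DC-DC converter (assume ~80% efficient).
device_b_w = 5.0 * 2.0             # 10 W at the converter output
converter_in_w = device_b_w / 0.8  # 12.5 W drawn from the 12 V rail
converter_in_a = converter_in_w / SUPPLY_V  # ~1.04 A from the supply

total_w = device_a_w + converter_in_w  # 32.9 W
total_a = total_w / SUPPLY_V           # ~2.74 A at 12 V

print(f"Total draw: {total_w:.1f} W ({total_a:.2f} A at 12 V)")
print("Within capacity" if total_w <= capacity_w else "Over capacity")
```

Running it confirms the numbers above: 32.9 W (about 2.74 A), safely under the 36 W limit.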
My question wasn't about power capacity. I'm just wondering: when Device A pulls more power than Device B is rated for, will it damage Device B, since they are connected to the same power source? And if not, why not?
As long as each load is used correctly within the supply's rated capacity, loads connected in parallel basically don't affect one another.
Let's think of electricity as a water supply.
Your house is connected to a single water pipe and supplied with water from the city. (The power supply.)
You pour a little water into a glass at the kitchen sink. (Device B.)
Meanwhile, your family takes a shower and drains a lot of water. (Device A.)
Does that make a lot more water come out of your faucet in the kitchen? No.
Each consumption is simply summed, so the amount of water flowing through the house's supply pipe increases.
This is the same as the current calculation I posted earlier.
Each device draws only the current it needs.
Even if multiple devices are connected in parallel, the current through each device doesn't change.
But the power supply does need to be able to output the total current.
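As a small sketch of that principle (using the two device currents from this thread, with Device B's ~1.04 A being what its converter draws from the 12 V rail), parallel loads just sum at the supply:

```python
# Parallel loads on one 12 V rail: each load draws only its own current,
# and the supply must be able to source the sum. One load cannot push
# extra current into another.
loads_a = {
    "Device A (direct, 12 V)": 1.7,
    "Device B (via DC-DC converter)": 12.5 / 12.0,  # ~1.04 A input side
}

total_a = sum(loads_a.values())

for name, amps in loads_a.items():
    print(f"{name}: draws {amps:.2f} A, unaffected by the other load")
print(f"Supply must deliver {total_a:.2f} A total")
```

This mirrors the water analogy: each faucet takes what it needs, and only the total flow through the main pipe goes up.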