I apologize in advance as this is probably a dumb question, but I've had a brain fart and am not sure my thinking is on track. I built a 5 V 6 A (30 W max) power supply for an LED sign, and while I've measured the output voltage with a multimeter, I also want to test its current-delivery capability to see whether it can really source up to 6 A (and how warm the board gets, to check thermal dissipation). Since I haven't finished the sign yet, I can't put the multimeter in series with the supply and the sign, so I was thinking of using a power resistor across the supply's terminals, in series with my meter:
Supply (+)---Resistor---Meter (+)
Supply (-)--------------Meter (-)
I was looking at a resistor like this http://www.bourns.com/data/global/pdfs/PWR221T-30.pdf that can handle 30 watts. Is this a valid way to test the supply when no real load is present? I'm trying to verify that it can actually deliver 6 A (or even 4-5 A).
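For sizing the dummy-load resistor, Ohm's law gives the resistance and dissipation needed at each target current. This is just the nominal arithmetic, assuming the supply holds its 5 V output under load (a real supply may sag a little):

```python
# Ohm's law sizing for a dummy load on a nominal 5 V supply.
V = 5.0  # supply voltage in volts (nominal, may sag under load)

for I in (4.0, 5.0, 6.0):  # target load currents in amps
    R = V / I              # required load resistance (ohms)
    P = V * I              # power the resistor must dissipate (watts)
    print(f"I = {I:.0f} A -> R = {R:.3f} ohm, P = {P:.0f} W")
```

Note that at the full 6 A the resistor sees about 0.83 ohm and must dissipate the entire 30 W, so a 30 W-rated part like the PWR221T-30 is running at its limit; its datasheet derating curve assumes a proper heatsink.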
PS: OOC = Out of Circuit