Look at it this way: every power supply (whether it's a battery, USB charger, switch-mode supply, adjustable bench supply, etc.) has two key ratings. The voltage it puts out, and the MOST current it can deliver before damaging itself (usually through heat).
The modules, LEDs, or MCUs you attach to that supply need two things to operate correctly: the correct input voltage, and a supply capable of delivering AT LEAST the maximum current they will draw.
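That two-part check can be sketched in a few lines of Python. This is just an illustration of the rule above; the function name and parameters are made up for the example:

```python
def supply_ok(supply_v, supply_max_ma, load_v, load_max_ma):
    """A supply suits a load only if BOTH conditions hold:
    the voltage matches, and the supply's maximum current
    rating is at least the load's maximum draw."""
    return supply_v == load_v and supply_max_ma >= load_max_ma

print(supply_ok(5.0, 1000, 5.0, 500))   # True: right voltage, enough current
print(supply_ok(5.0, 300, 5.0, 500))    # False: voltage is fine, current isn't
print(supply_ok(12.0, 1000, 5.0, 500))  # False: plenty of current, wrong voltage
```

Note the load only *draws* up to its maximum; the supply's current rating is a ceiling, not something it forces into the load.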
Think of it this way: a car has a 12 V battery in it. It can spin the starter and energize all the relays, computers, and lights. That's a lot of current, but still 12 VDC.
Now take 8 D cells and put them in series. That's also 12 VDC (8 × 1.5 V). Energizer or Duracell, it doesn't matter: they won't even light the headlights, let alone spin the starter, but they'll readily light up some LEDs on the workbench.
Those LEDs will also happily work if you take them off the bench and hook them up to the car battery.
Both supplies are 12 VDC, but rated for vastly different current levels. On the other hand, you really don't need a car battery to light a few LEDs when those D cells can handle the load. You also don't want to pull the maximum current a supply can deliver on a regular basis and expect it not to go belly up in short order.
If you have a 5 VDC, 500 mA module or load, use a supply rated for 800 mA or more. A 100 A supply would work just fine too, but it's overkill and a waste of money.
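That sizing rule is easy to turn into a quick calculation. The ~60% headroom figure below is an assumption chosen to match the 500 mA → 800 mA example above; it's a rule of thumb, not a hard spec:

```python
def recommended_supply_ma(load_ma, headroom=0.6):
    """Suggest a minimum supply current rating for a given load.

    headroom=0.6 rates the supply ~60% above the steady load, so the
    load never sits near the supply's maximum. (Assumed rule of thumb.)
    """
    return load_ma * (1 + headroom)

# The 500 mA module from the example above:
print(recommended_supply_ma(500))  # 800.0 -> pick an 800 mA (or bigger) supply
```

Anything at or above that number works; going far above it (the 100 A case) just costs more.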