So, first of all, I'm confident this has been asked before, but I haven't found an efficient way to search the forums. Just know that I HAVE tried to find the answer on my own before posting a trivial question here.
Anyway, I'm a computer engineer in the process of prototyping a senior project. As a more software-oriented person, I've never actually hooked up microcontrollers to other chips before, and I'm a bit confused about how the power supply works. Here's my best guess so far at how I should set this up:
We have an Arduino Mega, a WiFly (not the shield, just the bare module), a GPS, and a 128x64 LCD. The WiFly requires 3.3V, the LCD wants a minimum of 6V, and the GPS wants 5V.
If I get a power supply of, say, 12V and plug it into the Arduino, my understanding is that the Vin pin will output 12V DC, the 5V pin will output 5V, and the 3.3V pin will output 3.3V, and that I connect all the device grounds to the common ground pin on the board.
I don't think it will work to power the devices off these pins because of current draw, though, and this is where my knowledge gets shaky - it appears, for instance, that the 3.3V pin can only supply around 30mA, while our WiFly draws more like 200mA. Does the Vin pin pass through the full current of the power supply, or should we run the power supply into a breadboard and power everything off that, rather than into the Mega and powering everything off the Mega?
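To show my reasoning, here's the back-of-envelope current budget I'm working from. All the numbers are my assumptions pulled from datasheets (WiFly TX burst, GPS draw, regulator limits), not anything I've measured, so please correct me if they're off:

```python
def over_budget(loads_mA, limit_mA):
    """Return True if the summed device load exceeds the rail's current limit."""
    return sum(loads_mA.values()) > limit_mA

# Assumed regulator limits per rail on the board, in mA
rail_limit_mA = {"3.3V": 30, "5V": 500}

# Assumed peak draw per device, in mA (the LCD would get its own 6V supply)
loads_mA = {
    "3.3V": {"WiFly (TX burst)": 200},
    "5V": {"GPS": 45},
}

for rail, limit in rail_limit_mA.items():
    draw = sum(loads_mA[rail].values())
    status = "OVER BUDGET" if over_budget(loads_mA[rail], limit) else "OK"
    print(f"{rail} rail: {draw} mA drawn, {limit} mA available -> {status}")
```

By this math the 3.3V rail is the problem child, which is what prompts the question above.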
Thanks in advance - I appreciate it