"The LED will die!"
I'm fed up with people constantly insisting that you must use a resistor with ALL LEDs. That simply isn't true. Once we start dealing with high-powered LEDs, what's the point of the wasteful resistor?!
Take this example: an LED that handles 1 A at, say, 4 V (your power supply is rated 2 A at 5 V). Now let's pick a realistic drive current, say 800 mA.
5 V / 0.800 A = 6.25 Ω (yeah yeah, I'm deliberately leaving out the forward voltage drop; this LED doesn't exist anyway, but do add it if you must: (5 V − 4 V) / 0.8 A = 1.25 Ω, dissipating about 0.8 W). Without the drop, that 6.25 Ω resistor burns I²R = 0.8² × 6.25 = 4 W of heat.
A 4 watt resistor.
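For anyone who wants to check the sums above, here's a quick sketch (plain Python, using nothing but Ohm's law and I²R) that runs the calculation both ways, with and without the forward voltage drop:

```python
def series_resistor(v_supply, i_led, v_forward=0.0):
    """Series resistance needed for a target current, and the power it burns."""
    r = (v_supply - v_forward) / i_led   # ohms, from Ohm's law
    p = i_led ** 2 * r                   # watts dissipated in the resistor (I^2 * R)
    return r, p

# Ignoring the forward drop, as in the deliberately naive sum above:
print(series_resistor(5.0, 0.8))        # (6.25, 4.0) -> 6.25 ohms, 4 W of waste heat

# Including a 4 V forward drop, as a datasheet-honest calculation:
print(series_resistor(5.0, 0.8, 4.0))   # (1.25, ~0.8) -> 1.25 ohms, ~0.8 W
```

Either way, all of that power is doing nothing but heating the resistor, which is exactly the waste being complained about.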
Oh come on now, why?! All you have to do is look at the datasheet and you'll see a nice voltage/current curve, and you'll see there's plenty of headroom, unlike a regular 5 mm 20 mA LED... providing you set a precise voltage or limit the current some other way (but not with a 4 watt resistor).
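To make the "read the curve" point concrete, here's a small sketch that linearly interpolates a voltage/current curve the way you'd read one off a datasheet. The data points below are invented for illustration (they're not from any real part's datasheet), but the idea is the same: pin the forward voltage precisely and you've pinned the current.

```python
# Hypothetical V-I points for an imaginary 1 A power LED: (volts, amps).
# Real datasheets plot a similar curve; these numbers are made up.
vi_curve = [(2.8, 0.1), (3.0, 0.25), (3.2, 0.5), (3.4, 0.8), (3.6, 1.2)]

def current_at(voltage, curve):
    """Linearly interpolate the LED current at a given forward voltage."""
    for (v0, i0), (v1, i1) in zip(curve, curve[1:]):
        if v0 <= voltage <= v1:
            t = (voltage - v0) / (v1 - v0)
            return i0 + t * (i1 - i0)
    raise ValueError("voltage outside the tabulated curve")

print(current_at(3.2, vi_curve))   # 0.5 A: right on a tabulated point
print(current_at(3.3, vi_curve))   # ~0.65 A: halfway between two points
```

Note what the steepness of the curve implies: a tenth of a volt swings the current by hundreds of milliamps, which is exactly why the voltage (or current) has to be set precisely rather than left to chance.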
The LED will be perfectly fine...
(Waits for the "...at the moment" remark.) It really has to stop. If you put 9 V into a 5 V appliance, would you expect magic smoke? Probably. So why expect every man and his dog to over-drive an LED with a higher voltage than it can handle? Would you supply 9 V directly to an ATmega? So why can't you trust a person supplying exactly 2.8 V or 3.2 V? Why obsess!
Please go look up "direct drive": running high-powered LEDs directly from an unregulated 4.2 V lithium battery. They're all FINE...
So why, why, why, WHY must people KEEP telling others to use resistors with high-powered LEDs?! What next? "Oh, it's a 30 watt LED, make sure to use a 15 watt resistor to drive it at half power!"