A linear regulator converts the excess voltage to heat, which means you may have to attach a heatsink, depending on how many watts you're burning off. The dissipated power (watts) is calculated as Voltage x Current, using the voltage dropped across the regulator. If you are not drawing any load current, there is only a tiny amount of current used to "power" the regulator itself, and the regulator will remain cool.

A voltage divider will typically draw more current and generate more heat,
given the same amount of current through the load, because you have current through the "bottom" resistor in the voltage divider, as well as the current through the load. If you don't want the output voltage to change very much when you add (or change) the load, you have to use lower-value resistors and draw more current through the voltage divider. Typically, you want 10 times as much current through the voltage divider as through the load.
That's fine with "signals", but it's very bad for power supplies!
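Here's a quick sketch of that rule of thumb in Python. The specific numbers (9V battery, 5V out, 10mA load) are just example values I picked, not from the post:

```python
# Rough sizing for a resistive voltage divider, using the rule of thumb
# that the divider ("bleeder") current is 10x the load current.

v_in = 9.0      # supply voltage (volts)
v_out = 5.0     # desired output (volts)
i_load = 0.010  # load current (amps)

i_bleed = 10 * i_load            # current through the bottom resistor
r_bottom = v_out / i_bleed       # bottom resistor sets the output voltage
r_top = (v_in - v_out) / (i_bleed + i_load)  # top resistor carries both currents

p_in = v_in * (i_bleed + i_load)  # power drawn from the supply
p_load = v_out * i_load           # power actually delivered to the load
efficiency = p_load / p_in

print(f"R_top = {r_top:.1f} ohms, R_bottom = {r_bottom:.1f} ohms")
print(f"Input power = {p_in:.2f} W, load power = {p_load:.2f} W")
print(f"Efficiency = {efficiency:.1%}")
```

With these numbers the divider burns about 0.94W to deliver 0.05W to the load, around 5% efficiency, which is exactly why it's fine for signals and terrible for power supplies.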
If you "cheat" and use the load as the bottom resistor, the top resistor will generate the same amount of heat as a regulator. This can be done in rare cases, such as with an incandescent lamp, but it usually wastes too much power and generates too much heat.
And if the load changes (such as an LED turning on, etc.), the voltage will change. (The voltage is unregulated.)
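You can see how badly the voltage moves with a quick calculation. The values here (9V in, a series resistor sized for 5V across a nominal 500-ohm load) are illustrative assumptions:

```python
# Unregulated "divider" where the load itself is the bottom resistor.
v_in = 9.0
r_series = 400.0   # dropping resistor, sized for 5 V across a 500-ohm load
r_load = 500.0     # nominal load (5 V / 10 mA)

v_nominal = v_in * r_load / (r_series + r_load)

# Now the load changes -- say its resistance halves (it draws more current):
r_load_heavy = 250.0
v_heavy = v_in * r_load_heavy / (r_series + r_load_heavy)

print(f"{v_nominal:.2f} V at nominal load")
print(f"{v_heavy:.2f} V when the load resistance halves")
```

Doubling the load current drags the output from 5V down to about 3.5V, so anything sensitive to supply voltage will misbehave.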
A voltage regulator has two kinds of regulation... There is line
regulation, which means that you'll get 5V out of your 5V regulator, even when your 9V battery gets weak and drops to around 6V. And, there is load
regulation, which means that the 5V doesn't change when the load current/resistance changes (as long as you stay within the specs).
A switching regulator is more complicated than a linear regulator, but it can be over 90% efficient and generate very little heat. When stepping the voltage down, you generally get more current out than you feed in: the power out (voltage x current) is nearly equal to the power in (voltage x current), and very little power is wasted as heat.
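The difference is easy to see with numbers. This sketch compares the two approaches feeding the same 5V, 1A load from 9V; the 90% switching efficiency is an assumed, typical figure, not a spec for any particular part:

```python
# Linear vs. switching regulator feeding a 5 V, 1 A load from 9 V.
v_in, v_out, i_out = 9.0, 5.0, 1.0
p_out = v_out * i_out                   # 5 W delivered to the load

# Linear: input current equals output current; the voltage drop becomes heat.
p_heat_linear = (v_in - v_out) * i_out  # the dropped 4 V x 1 A = 4 W of heat
eff_linear = p_out / (v_in * i_out)     # about 56% efficient

# Switching (buck), assuming 90% efficiency:
eff_switch = 0.90
p_in_switch = p_out / eff_switch        # power drawn from the supply
i_in_switch = p_in_switch / v_in        # input current is LESS than output current
p_heat_switch = p_in_switch - p_out     # only the losses become heat

print(f"Linear:    {p_heat_linear:.2f} W heat, {eff_linear:.0%} efficient")
print(f"Switching: {p_heat_switch:.2f} W heat, {eff_switch:.0%} efficient")
print(f"Switching input current {i_in_switch:.2f} A for {i_out:.2f} A out")
```

The linear regulator dumps 4W into its heatsink; the switcher wastes about half a watt and pulls only ~0.62A from the battery to deliver 1A to the load.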