That is basically the question this whole thread is about.
If I read what you said correctly, that is the same thing. The mean current is
Im = I_on * T_on / T_period
The mean power is
Pm = P_on * T_on / T_period = (U * I_on) * T_on / T_period = U * (I_on * T_on / T_period) = U * Im
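To make the arithmetic concrete, here is a minimal sketch of that math with made-up numbers (12 V supply, 2 A on-current, 25% duty cycle; none of these values are from this thread):

```cpp
// Made-up example values, just to illustrate the duty-cycle math above.
void setup() {
  Serial.begin(9600);
  const float U    = 12.0;        // supply voltage in volts (assumed)
  const float I_on = 2.0;         // current while the LEDs are on, in amps (assumed)
  const float duty = 0.25;        // T_on / T_period

  const float Im = I_on * duty;   // mean current: 0.50 A
  const float Pm = U * Im;        // mean power:   6.00 W (same as P_on * duty)

  Serial.print("Im = "); Serial.print(Im); Serial.println(" A");
  Serial.print("Pm = "); Serial.print(Pm); Serial.println(" W");
}

void loop() {}
```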
BUT: a switching power supply has different efficiencies (GrumpyMike, I would still like to know why that is a wrong use of the word) at different loads. So a 100 W supply loaded at 70% for 10% of the time will most probably draw less overall power than a 100 W supply loaded at 7% for 100% of the time.
If the LEDs switch on and off slowly enough, the first situation applies. If they switch too fast, however, the supply cannot resolve the on/off states and instead sees a constant 7% load, at a worse efficiency.
My first experiment showed that for the supply I have, PWM frequencies of a few hundred hertz are fine. Above that, I am not sure whether the logic-level MOSFET could no longer follow because of its gate capacitance, or the supply became less efficient.
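If anyone wants to repeat that experiment on an ATmega328P board (Uno/Nano), something like this steps the PWM frequency on pin 9 by changing the Timer1 prescaler. The formula and prescaler codes are standard for that chip; the pin choice and the MOSFET wiring are my assumptions:

```cpp
// Rough test sketch, assuming an ATmega328P (Uno/Nano) with the MOSFET gate on pin 9.
// Timer1 drives pins 9/10 in phase-correct 8-bit PWM: f = 16 MHz / (prescaler * 510).
const uint8_t PWM_PIN = 9;

void setPrescaler(uint8_t cs) {   // cs: 0x01=1, 0x02=8, 0x03=64, 0x04=256, 0x05=1024
  TCCR1B = (TCCR1B & 0b11111000) | cs;
}

void setup() {
  pinMode(PWM_PIN, OUTPUT);
  analogWrite(PWM_PIN, 64);       // 25 % duty cycle
  setPrescaler(0x04);             // prescaler 256 -> about 122 Hz
  // 0x03 -> ~490 Hz (the default), 0x02 -> ~3.9 kHz, 0x01 -> ~31 kHz
}

void loop() {
  // Watch the supply's input power (e.g. with a plug-in power meter) at each frequency.
}
```

Comparing input power at the same duty cycle across these frequencies should show where the supply (or the gate drive) stops keeping up.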
For all considerations regarding PWM vs. current limiting by resistor, read my summary in post #40. Short version:
Efficiency is the same; current limiting might distribute the heat better, but may make color balancing of multiple RGB strips quite difficult. That is why I will go with well-cooled, PWM-controlled strips for now.
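For the PWM route, the color balancing that the resistor approach makes hard is just per-channel scaling in software. A minimal sketch, assuming strips on three PWM pins and made-up calibration factors:

```cpp
// Hypothetical pin assignment and calibration values, for illustration only.
const uint8_t R_PIN = 9, G_PIN = 10, B_PIN = 11;

// Per-channel scale factors to balance the strips' white point (0.0 .. 1.0).
const float R_CAL = 1.00, G_CAL = 0.80, B_CAL = 0.65;

void setRGB(uint8_t r, uint8_t g, uint8_t b) {
  analogWrite(R_PIN, (uint8_t)(r * R_CAL));
  analogWrite(G_PIN, (uint8_t)(g * G_CAL));
  analogWrite(B_PIN, (uint8_t)(b * B_CAL));
}

void setup() {
  setRGB(255, 255, 255);   // should now come out as a balanced white
}

void loop() {}
```

With resistors, the same correction would mean swapping parts and re-measuring; here it is one constant per channel.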
Even better would be voltage control down to the maximum desired brightness, if that brightness is much lower than what 100% PWM duty cycle would give.