That sounds constructive to me. Personally I would rather spend time building projects that do useful, or at least fun, stuff than learning how to program the AVR in assembly language. C/C++ gives you all the low-level control you or the chip will ever need, and the optimizer in gcc is said to generate code as efficient as, or better than, that of all but the most experienced assembly programmers.
Personally I fully support the idea of "leaving the low level for when it's really needed". Especially at the beginning, it's far more gratifying to make things work and try new stuff than to bang your head against the wall because you don't understand where it fails.
I guess this is more of a general gcc question, but do you have an idea of how efficient gcc really is? I read somewhere else that Arduino's digitalRead() function compiles to about 20 instructions, while in assembly terms it's just a single bit check. Of course, maybe that overhead doesn't come from the compiler at all.
I understand that a #define, being a preprocessor directive rather than a compiler directive, takes zero instructions: it's pure textual substitution done before the compiler even runs.