Y'all folks who seem to "talk hex & binary" amaze me with your knowledge of the inner workings of code.
Computers are the most complex machines mankind has ever created. With that said, at their core, they really are nothing more than very fast player pianos with really long song reels.
Code is nothing more than an abstraction of those holes in the song reel (addresses and data bytes) into something more "human readable". All a compiler does (this is a simplification, of course) is take those human-readable bits and translate them into a series of binary representations which are, at heart, on-off switches connected via a very complex arrangement of NAND/NOR gates (any and all logic may be represented by combinations of such), describing a virtual wiring network, one which can change from one state (or machine/CPU cycle) to the next.
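
To make the "everything from NAND" claim concrete, here's a minimal sketch in C (the little helper names like nand/not_/and_ are just mine for illustration):

    #include <stdio.h>

    /* Any Boolean function can be built from NAND alone. */
    /* Working with 1-bit values (0 or 1) throughout.     */
    static int nand(int a, int b) { return !(a && b); }

    static int not_(int a)        { return nand(a, a); }
    static int and_(int a, int b) { return not_(nand(a, b)); }
    static int or_(int a, int b)  { return nand(not_(a), not_(b)); }

    /* The classic four-NAND XOR construction. */
    static int xor_(int a, int b) {
        int t = nand(a, b);
        return nand(nand(a, t), nand(b, t));
    }

    int main(void) {
        for (int a = 0; a <= 1; a++)
            for (int b = 0; b <= 1; b++)
                printf("a=%d b=%d  NOT a=%d  AND=%d  OR=%d  XOR=%d\n",
                       a, b, not_(a), and_(a, b), or_(a, b), xor_(a, b));
        return 0;
    }

Run it and you get the full truth tables, every gate ultimately bottoming out in the same single operation - which is exactly the sort of trick the "virtual wiring network" pulls off in silicon.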
One place where it gets really fun is microcode - code used internally by the processor, programmed in at the factory; an abstraction on top of an abstraction, if you will.
Wanna know the crazy thing? We could've had calculating machines nearly one hundred years before we actually did, had Babbage chosen to use relays (and binary logic) instead of gears (and decimal notation, which was more suitable to mechanical implementation). Not necessarily computers as we know them; it would take the abstraction work of Church and Turing to make the mental shift from calculation to general-purpose symbol manipulation, something which wasn't really being thought about in Babbage's time, though I bet a real, working machine like his would have spurred it, much as it did for both Church and Turing in the 1930s. Don't get me wrong, Babbage's machine could've brought us a great leap forward had there been the financial backing for it (among other reasons); it was certainly a Turing-complete design. However, Babbage would've had an easier time implementing his ideas had he used Boole's binary notation and logic system (something Babbage would've had knowledge of; he did hold the Lucasian chair for a while, after all), along with relay logic. The relays existed, and so did the power sources (batteries, at a minimum). Space would've been an issue, but the Engines he designed were already behemoths, had they been built, and the relays could've been miniaturized as well.

I can only speculate why he didn't do it. Most likely the thought never occurred to him (calculation with electricity would have to wait for Herman Hollerith and the 1890 US census, which ultimately begat IBM); or, if it had, he couldn't see how a relay could be made smaller (though that seems unlikely); or relays were simply too unreliable at the time (a real possibility - most relay-like devices then were telegraph sounders; it wasn't until the latter half of the 1800s that relays were used for switching and other purposes, especially with the advent of multiplexed telegraphy, stock tickers, alarms, and telephones).

Most of the complexity of code is only in your mind; high-level C/C++ code is really a simplification of what goes on down at the metal (if you have ever performed hand assembly at the hex level, you know what I mean - several old-timers here do, I know). Could things be made simpler? Of course they could, but you can only get so simple before you reach a point where it becomes impossible to do complex things.
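
For a taste of what "down at the metal" looks like, here's a trivial C function alongside the machine code a typical compiler emits for it. This is just a sketch - I'm assuming x86-64 and the System V calling convention, and your compiler and flags may well produce different instructions (an add instead of a lea, say):

    /* add.c - about as simple as a function gets */
    int add(int a, int b) { return a + b; }

    ; Typical optimized x86-64 output, with the hex bytes you'd
    ; punch in if you were assembling it by hand:
    ;
    ;   8d 04 37    lea eax, [rdi + rsi]   ; eax = a + b
    ;   c3          ret

Three bytes of opcode-and-operand encoding plus a one-byte return: that's the whole function. Everything the compiler layers on top of that is convenience for us, not for the machine.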
