Fair enough. Nobody is allowed to complain. Let's keep it exactly as it is and always has been.
Look, I don't believe you. Otherwise why aren't we all still programming in 6800 or 8080 assembler, or something like that? There is no fixed, linearly proportional distance to some “wall” that will inevitably be hit no matter what. Things can be made easier by removing obstructions, and the work still gets done and the objective is still achieved.
By the way, what is wrong with people's comprehension lately? A language that is easy is not the same as a language that is simple. I'm not suggesting we simplify things at all, but I am strongly pointing out that it needs to be easy. An easy language can be every bit as powerful as a complex, irritating and uncomfortable one. There are plenty of things that used to be done in a more complex, ornate, flowery and decorative way, and have since been whittled down to a more streamlined, elegant and frankly easier way of operating. I could think of examples, but I'm sure anyone reading this could do so too.
I know from observation that some languages are not going to be rewarding or productive for novices in a limited time, whereas others are. Considering that programming is just one of the things a student has to contend with to get a creative art project designed and built, having to spend most of the limited time on just the programming part essentially sabotages the other parts of the production. Yet it is usually in those other areas that they would excel, given the chance, if this alien programming phase didn't take far too long and produce next to useless results every time. It puts people off ever getting this deeply into computers again, and they'll typically refuse to risk getting involved with technology again. It's less risky and more productive to just produce something acceptable that doesn't have a microprocessor sitting in the middle doing nothing useful except costing a lot, because the bloomin' thing doesn't fulfil the potential it promised, thanks to a far-from-streamlined programming phase.
There's more to a project than spending time guessing the right words in the right order and spending weeks tracking down daft syntax errors. A friendlier language would let them get to the stage where they're instead spending weeks tracking down fundamental misconceptions about actions or process design, because computers solve tasks in a way that no human being (well, no fashion designer) would have guessed in a million years. I think it's better to rush ahead to the stage of thinking “how would I do this if I were a microprocessor?” than to be stuck for several months at the early stage of “say the magic word or you'll just get another page of syntax errors”.
In other design domains, with a different set of people, I've seen novices produce usefully gratifying results in an afternoon with Ruby*, to the point that over the next few sessions they're in a position to make their own modifications, which largely sort of work (although usually not at all in the way they thought they would). I've also seen people spend an entire semester learning to hate Java, because all they see are syntax errors, nothing they imagine will work actually does, and there's just too much fancy ornamentation to remember to get right each time. I've seen the same people, a few months later, get introduced to PHP and rush straight ahead, actually producing things that sort of work.
The more I look at Lua lately, thanks to several suggestions (most of them, though not all, on this forum), the more I'm convinced it would make a fairly good launch-phase language, where it is the first syntax the novice sees. Although it's alien, it's not 100% martian, and at least some of what they try themselves, beyond (or beside) the examples given in tutorials, might stand some chance of showing evidence of working.
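To illustrate what I mean, here's a sketch of what a novice's first blink-style program might look like if Arduino spoke Lua. This is purely hypothetical — there is no such API; the pinMode/digitalWrite names are just borrowed from the classic Arduino vocabulary and stubbed out so the snippet actually runs in plain Lua:

```lua
-- Hypothetical first sketch a novice might write if Arduino spoke Lua.
-- pinMode/digitalWrite are NOT real Lua functions; they're stubbed here
-- purely so this illustration is self-contained and runnable.
local function pinMode(pin, mode)
  print("pin " .. pin .. " configured as " .. mode)
end

local function digitalWrite(pin, level)
  print("pin " .. pin .. " -> " .. level)
end

-- No type declarations, no semicolons, no curly braces to mismatch:
pinMode(13, "OUTPUT")
for blink = 1, 3 do
  digitalWrite(13, "HIGH")
  digitalWrite(13, "LOW")
end
```

Compare that with the ceremony of the equivalent C: types to declare, a semicolon on every line, and braces that must balance, each of them a fresh opportunity for the beginner to hit that page of syntax errors.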
Could a layer of Lua (theoretically) drive the C compiler that pushes GCC to make AVR binaries, and still work within an expanded Arduino IDE? Note, I'm not talking about eLua (http://www.eluaproject.net/), which itself seems a good project. (Incidentally, I'm starting to wonder if Forth running on the microcontroller is actually the good idea I'd always assumed it was.) I'm now thinking it is better to do it the way it is currently done — in an IDE on a MacBook (or whatever else you use), with a toolchain that emits an AVR binary and gets it onto the ATmega chip in a predictable fashion. If there were a Lua option alongside the C/“Arduino classic” option, neophytes could start off with Lua and progress from there. If this wall of yours appears, they'll have had more productive experience by that point, and will be in more of a state of mind to tackle it. I'm not suggesting that Lua is some magic potion that solves all the problems, but it's fairly clear to me that it goes some way towards solving a significant one: frankly, C is too difficult.
--*Of course, at one point I was strongly wishing that Arduino could be instructed in Ruby, but on second thoughts, Ruby is highly object-oriented — that's one of its strengths — and physical computing is probably one area where object orientation turns out to be simply an irrelevance.