Digital illiteracy?

After all, isn't Arduino itself a simplified (and thereby in some ways crippled) version of microcontroller development? I am POSITIVE there are purists who scoff at the entire concept of Arduino, even those who use the exact same hardware.

Yep.

We need some people who understand the 'innards' of electronics, but it's more a shortage of generally computer-literate people. Not just for computer/programming/electronics based jobs but for every job - people should know how to plug in a power cable and turn on a computer (i.e. know that the power switch for the screen does not turn on the computer, etc...). They should not need to ring tech support every time a message pops up telling them the virus database has been updated, because they don't know what a virus (or a database) is and think the message might be bad.

Well (getting ready to duck), consider Apple.. at least since the Mac. Yes, I know what's under the hood. Not the point.. but in some ways it IS the point. The end user DOESN'T CARE that under the hood the current Mac OS is actually a form of UNIX. In fact, most Mac users specifically choose a Mac because it completely removes them from the "computer" side of things and provides them with an Appliance. I do apologize to the technical Mac users out there.. and there are plenty.. but even they must admit they are the exception and NOT the rule.

If that's what you are looking for, they deliver it. Sure, it drives the techies crazy (Oh... the wasted CPU cycles.. bloatware.. argh..); but if you are, say, a Graphic Artist, you don't want a Computer.. you want a Graphic Arts Machine. If all the tech whining in the world were blaring in that Graphic Artist's face, he wouldn't care, as long as the box in front of him let him draw whatever it was he wanted to draw. It's because of this that Apple held that market... the Artists didn't care that it wasn't an ideal application of technology; they only cared that it did a relatively simple task very well with little pain on their part.

The PC world in many cases (at the time) was more difficult technically for someone to implement; it wasn't "plug and play" like Apple's closed architecture was. Apple wasn't "Better Technically" at the time, they were instead "Technically Better", because a non-IT end user could build a working network successfully, even if it wasn't the most efficient. Windows is a little less elegant, but no less an attempt at abstracting away all the painful details to simplify a user interface.

For that matter, unless you take it (as a programmer) down to the level of bits and gates, you are "dumbing down" and by that measure would be a "noob". I've never written global failover and replication routines to "four nines" (99.99% uptime) in octal machine code, but I was able, via "Dumbing It Down", to do it fairly painlessly in Oracle and some creative scripting. It's a question of using the right tool for the job.

The hope is that there are enough folks out there who by their very nature (like us) can't leave well enough alone and just HAVE to stick their fingers in the pudding. We must realize and admit that this is an illness. Only the best type of illness, mind you.. but an illness nonetheless.

Come on, get up, get down with the sickness..... :wink:

(Oh... the wasted CPU cycles.. bloatware

Oddly, that's what I hear most people say about Microsoft OSs.

At what point do you call anything more than binary gates Bloatware? It's >> ALL << bloatware.

Arduino is bloatware.. C++ is bloatware.. C is bloatware.. Assembler is bloatware.. Compiled code of any sort * IS * bloatware. Unless you code per clock cycle manually at the lowest level, everything (and I do mean evrything) is bloatware.

The point here is not "mastering programming for everyone", more of "a survey of what you can do with computers if you are allowed to control them with your own programs". I think lots of visual artists have their eyes opened when they are introduced to Processing, a simple and elegant way to do lots of visual stuff without Photoshop or what not. They even start learning kinematics to make things fall or bounce more realistically, sometimes very spontaneously. I don't see many of my students learn kinematics in physics classes though. If they know that there is a wonderful world inside of every computer, it is up to them to decide whether to explore it or not. Human nature may just dictate the yes answer. Simply locking it up like Apple or MS do, for the sake of fewer customer service calls or what not, is wrong.
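
As a hedged illustration of the kind of kinematics those artists pick up (a minimal sketch of the idea, not anyone's actual Processing lesson), the whole "make it fall and bounce" trick is just a couple of state updates per frame:

```cpp
#include <cstdio>

// Minimal falling-and-bouncing ball: the only "kinematics" needed is one
// velocity update and one position update per frame, plus a bounce rule.
int main() {
    float y = 0.0f;           // height above the floor, metres
    float v = 0.0f;           // vertical velocity, m/s (negative = falling)
    const float g = -9.8f;    // gravity
    const float dt = 0.02f;   // 50 frames per second
    const float floorY = -10.0f;

    for (int frame = 0; frame < 200; ++frame) {
        v += g * dt;          // gravity changes velocity
        y += v * dt;          // velocity changes position
        if (y < floorY) {     // hit the floor:
            y = floorY;
            v = -v * 0.8f;    // bounce back up, losing 20% of the speed
        }
        std::printf("t=%.2fs  y=%.2f\n", frame * dt, y);
    }
    return 0;
}
```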

Imagine you have five fingers but, as a baby, you are made to wear a pair of mittens for your own safety (maybe to keep you from scratching the hell out of your itchy skin). Then, after years of constraint, you don't even know you ever had the freedom to move those five fingers independently, and you pick up food like an animal. This is the reality of years of dumbing the public down on computer literacy. Nobody knows how a computer works or what you can do with it (other than Facebook or Twitter). All they care about is what they see and what others do with computers.

At what point do you call anything more than binary gates Bloatware? It's >> ALL << bloatware.

No, bloatware starts as soon as you get features you don't want or need.

Thanks for the pointer to Rule 110, I have never come across that. In my defense I never did any formal study of computer science; basically I am an electronics engineer / physicist.

The end user DOESN'T CARE that under the hood the current Mac OS is actually a form of UNIX.

True but then I didn't think this was about end users.

and I do mean evrything

As are vowels :smiley:

you don't want or need.

Bit like PPI then :wink:

Well, I guess it depends upon what you call an "End User", doesn't it? We as Arduino enthusiasts wouldn't be considered "end users" by most, but to Atmel's engineers we may very well be considered End Users. I'd consider myself an "End User" of Arduino, but if I make something and someone uses that item.. I'm not the end user any more, am I? Atmel is an "End User" to whoever it is that makes their silicon wafers. That company is an end user of a sand company :wink: If there's a single person here who builds their own processors from sand, then I guess they can hold court over the rest... but otherwise.. it's all bloatware, and everyone is someone else's end user.

Somewhere, there is a mailman that delivers his own mail. I know it. All I need to do to fill in that hole over there is to take a bunch of dirt from somewhere else.... isn't that Cervantes?

Assembler is bloatware

Almost all assembler is 1:1 source to object; how can that be bloatware?

Because it DOES incorporate macros, usually. I'm not saying the amount is large, but it is not non-existent for (most) assemblers. It's a simplification at one level or another; heck, I suppose you could claim opcodes are bloatware. If I write a program in machine language, there will still be opcodes I won't use. They won't be needed for my project. My GOD, there are TEN TIMES as many opcodes as my project needs, what a chunk of bloatware, huh? :slight_smile: Have you ever considered how much bloatware it is that anything other than the 2N3904 transistor has ever been made, considering that's what I've used for almost all of my projects? Geez, the semiconductor industry is nothing but bloatware...lol. Yes that's hyperbole, but it's to illustrate.. one person's bloatware is another's required core function.

The point remains that there really isn't a "bottom" level of technology if you want to look closer. Anything we do is built upon something else.. that's a goodness. I don't want to invent Transistors before I can read my news blogs... patterns within patterns within patterns. Gee, sounds an awful lot like cellular automata... a mathematical abstraction, huh?
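
For anyone who, like the poster above, hadn't run into Rule 110 before: it's a one-dimensional cellular automaton in which each cell's next state depends only on itself and its two neighbours, and that tiny rule has been proven Turing-complete. A minimal sketch, just to show how little machinery "patterns within patterns" actually needs:

```cpp
#include <cstdio>
#include <vector>

// Minimal Rule 110 cellular automaton: each cell's next state depends only
// on itself and its two neighbours, yet the rule is Turing-complete.
int main() {
    const int width = 64, steps = 32;
    std::vector<int> cells(width, 0), next(width, 0);
    cells[width - 1] = 1;                        // start with one live cell

    for (int t = 0; t < steps; ++t) {
        for (int i = 0; i < width; ++i)
            std::putchar(cells[i] ? '#' : '.');
        std::putchar('\n');

        for (int i = 0; i < width; ++i) {
            int l = cells[(i + width - 1) % width];   // wrap around the edges
            int c = cells[i];
            int r = cells[(i + 1) % width];
            int pattern = (l << 2) | (c << 1) | r;    // neighbourhood as 0..7
            next[i] = (110 >> pattern) & 1;           // look up the bit of 0b01101110
        }
        cells.swap(next);
    }
    return 0;
}
```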

I think we have a disagreement about a definition of "bloatware".

To me, bloat is a massive set of class libraries which may improve apparent programmer productivity, but which results in unnecessary amounts of imported code, or over-large stack frames, or mechanisms hidden from the user that allocate resources "just-in-case".

Well, when you produce the compiler that generates 100% hardware-optimized, portable code, you let me know, and we will both be billionaires, okay?

Nobody, EVER, has done it yet. Once you look at what a compiler.. a general-purpose compiler.. really has to do, providing both flexibility and a stable framework, yet staying simple enough to work in abstracts, it can NEVER really be optimal. It can get dang close, might even LUCK upon it.. but compiled code IS macros written by someone else, strung together: every single inefficiency in their code becomes your code. Unless you trust that the compiler authors absolutely got every code optimization right in THEIR code, you can't trust that your code compiles to the most efficient form, even if you assume the "perfect" compiler and optimizer.

You sacrifice efficiency for ease of use... and even then, the programmers of the compiler may choose a less efficient method of doing a particular task simply because they were taught a particular theorem, or just because doing it "n" degrees better would take "n^35" effort. If code optimization were easy, compilers wouldn't exist and interpreters would be just as efficient, minus reading the code's bytestream from memory. In terms of actual storage space for the application code, tokenized statements (10 PRINT "Hello World") are frequently smaller than the equivalent compiled code running on the CPU.

It's again just a matter of choosing the right tool for the job. Screwdrivers are really useful, but there are things you can't do with a screwdriver. Well, maybe you could threaten to poke someone in the eye with a screwdriver to force them to give you the actual tool you need, and get it done that way, but you get the idea.

Shell script is better than a poke in the eye with a screwdriver. Barely.

Anyone who laughs at Visual Basic has never done any real world coding... show me a professional who doesn't consider Perl an indispensable tool. Very few things are as inefficient as shell scripting, but try living without it. These are all interpreted languages; at BEST, VB compiles into P-code. The "HAL" - the Hardware Abstraction Layer - of any OS is pretty much where application programming begins.. so by definition, any software is already "bloatware" by nature of sitting on an OS. An interpreted programming language is horrid from a CPU-efficiency standpoint, but when you need it to simply WORK, "bloatware" isn't bloat at all if it provides the only REASONABLE way to perform a task. MAYBE I could walk to California, but I'd be foolish not to take a plane instead, despite a 747's poor fuel economy.

Theory would have you chase down every clock cycle. Practice makes you realize the clock cycles aren't important if you can't make the thing do what you need with a reasonable amount of effort. I'll trade a little efficiency for a working product, rather than have the most efficient design ever, one that never gets built.

I'm afraid the guys at ARM have beaten us to the billions.

(bloat isn't a few odd cycles here and there)

ARM is really going to take off with their new dual-core chips and Windows 8 supporting it.

focalist:
Yes that's hyperbole, but it's to illustrate.. one person's bloatware is another's required core function.

I'd like to meet the person for whom animated desktop switching, by displaying a rotating cube which quivers like jello while it's moving the new desktop into position, is a core function.

focalist:
Unless you trust that the compiler authors absolutely got every code optimization right in THEIR code, you can't trust that your code compiles to the most efficient form, even if you assume the "perfect" compiler and optimizer.

Well, it was years ago, but what I remember hearing was that it took an exceptionally talented Macro-32 programmer to write better assembly than the Fortran compiler produced under VMS. (VMS was at 3.7 when I heard that. We were running a VAX-11/750 with 4MB of RAM, and an Ingres [version 2.n] database.)

Screwdrivers are really useful, but there are things you can't do with a screwdriver.

Yeah, but add a hammer and now you're ready for 99%. :smiley:

Anyone who laughs at Visual Basic has never done any real world coding

I despise VB. I've laughed at it a whole number of times. Lots of real-world coding under my belt. (And yes, I've unfortunately had to write VB for work too.)

... show me a professional who doesn't consider Perl an indispensable tool.

You know, I write a bit of Perl here and there, but Bash, awk and grep, and pipes, get a lot of things done. Of course, YMMV, but I haven't found Perl to be indispensable. I'm sure a lot of Windoze programmers don't use it at all.

I tried learning Perl many years ago but really didn't understand it. After many wasted months, nothing at all worked and I had no idea why. The bastards on IRC were the absolute opposite of helpful.

IRC can certainly be hit or miss. I find Perl to be both useful and frustrating. However, that's unsurprising given my personal idiosyncrasies. In the Camel book, Larry wrote to the effect of "write what you think should happen, and Perl will probably do it", or something like that (the book isn't ready to hand at the moment). Well, that never worked for me, even when I thought I was following the example code pretty well.

This thread got me to remember that classic of programming wisdom, Real Programmers Don't use Pascal.

Well Ian, I agree in some respects with your tagline.. in fact, it exemplifies the issue: how much simplification is enough, and how much is too much? The question I'll put in return is "Well, just how far should the simplification go? BASIC? LOGO?" I'm not being facetious; what would, in your opinion, be a level of abstraction that would work for someone like you? In reality, another layer of macros is all that's needed: wrap more of the lower-level functions, cut down the I/O options, and there you go. LEGO NXT must be something like that, I imagine (I've never used it). At compile time, those simplified macros get replaced with Arduino statements, which then are translated to C, which are then translated to machine code by GCC. It's just another layer of the onion. (Again, I'm simplifying for the sake of clarity; compiling to Arduino before C would just be silly.. but sillier things have been done..)
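
Just to make the onion concrete, here's a hedged sketch of what such an extra layer might look like: a couple of hypothetical "kid-level" macros (the names are invented for illustration) that expand straight into ordinary Arduino statements before the compiler ever sees them:

```cpp
// Hypothetical "simplified" layer: these macro names are invented for
// illustration; each one just expands into plain Arduino statements,
// which the toolchain then compiles down to machine code as usual.
#define LIGHT_PIN        13
#define LIGHT_ON()       digitalWrite(LIGHT_PIN, HIGH)
#define LIGHT_OFF()      digitalWrite(LIGHT_PIN, LOW)
#define WAIT_SECONDS(s)  delay((s) * 1000UL)

void setup() {
  pinMode(LIGHT_PIN, OUTPUT);
}

void loop() {
  LIGHT_ON();        // reads like LOGO, compiles like C++
  WAIT_SECONDS(1);
  LIGHT_OFF();
  WAIT_SECONDS(1);
}
```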

One of the nicer features of Arduino as a platform is that it doesn't limit you to the "easy" layer: like most C derivatives, it allows for inline code at a lower level without detracting from the simplicity. Complex things can be wrapped up as libraries, and slapping in a few register calls, port manipulations and snippets of assembler isn't hard. Most will never use the lower level, but it's there if we want it.
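
For example (assuming an Uno-class board with an ATmega328P, where digital pin 13 is PORTB bit 5), the very same pin can be driven from the friendly layer, from the registers, or from inline assembler, all in one sketch. A minimal sketch of the idea:

```cpp
// The same LED pin driven at three levels of abstraction, side by side.
// Assumes an Uno-class board (ATmega328P) where digital pin 13 is PORTB bit 5.
const int ledPin = 13;

void setup() {
  pinMode(ledPin, OUTPUT);           // the friendly Arduino layer
}

void loop() {
  digitalWrite(ledPin, HIGH);        // Arduino call: portable and forgiving, a few dozen cycles
  delay(500);

  PORTB &= ~_BV(PORTB5);             // direct register write: LED off in a single instruction
  delay(500);

  asm volatile("sbi %0, %1" ::       // inline assembler: set the bit ourselves, LED back on
               "I" (_SFR_IO_ADDR(PORTB)), "I" (PORTB5));
  delay(500);

  digitalWrite(ledPin, LOW);         // and back to the easy layer
  delay(500);
}
```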

I've not yet taken the time to walk through the assembler that GCC makes from the C that's made from the Processing-style code that's made from an Arduino sketch. I suspect it's less than optimal.. but it works. For hobby purposes, I think it may be more important that it makes these complex tasks implementable by a non-expert. Time and again, what has happened is that it's not necessarily the Best technology that wins, it's the Most Accessible.

Remember DEC? Ken Olsen's fatal mistake was being ahead of his time. He was selling cloud computing in the 80s. He knew that high-powered desktop PCs would be mainly a fad and that the enduring tech would be a feature-rich thin client. We call that the Web. Virtually all of the "work" being done these days is on the server side, so you can support low-powered interfaces like a cell phone. He was right, just wrong on how long it would take before only techies had any real horsepower on their desk and most people would be happy with an easy-to-use thin client. Even the souped-up gaming systems are really only trying to get a little more out in terms of video framerate (aka a cute interface); the actual "logic" of the games barely ticks over a CPU, comparatively. All that power is just for making a PRETTY thin client.. not for computing the logic of "hit monster five times for ten points of damage".

Very little COMPUTING is done on desktops... in fact, I probably do more than most, because Photoshop is computationally intensive and I do high-resolution photo editing.. something that DOES require large numbers of bit planes and matrix math at ridiculous levels, FAR more than any game. I don't need to render fast... I need to process image data that uncompressed is fifty to a hundred megabytes for a single picture.. and each pixel needs to be recalculated based upon its neighbors.. then adjusted for a dozen different parameters.. aligned.. error corrected.. and THEN you can start actually changing things. That's real computing power, even if we don't think of it that way.
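
As a rough illustration of that "each pixel depends on its neighbors" cost (a minimal sketch, not Photoshop's actual algorithms), here's a plain 3x3 box blur over a grayscale buffer:

```cpp
#include <cstdint>
#include <vector>

// Minimal sketch: recompute every pixel of a grayscale image as the average
// of its 3x3 neighbourhood (a box blur). Real editors apply far fancier
// kernels, but the "every pixel depends on its neighbours" cost is the same:
// a 50-megapixel frame means this inner loop runs tens of millions of times.
std::vector<uint8_t> boxBlur(const std::vector<uint8_t>& src, int w, int h) {
    std::vector<uint8_t> dst(src.size());
    for (int y = 0; y < h; ++y) {
        for (int x = 0; x < w; ++x) {
            int sum = 0, count = 0;
            for (int dy = -1; dy <= 1; ++dy) {
                for (int dx = -1; dx <= 1; ++dx) {
                    int nx = x + dx, ny = y + dy;
                    if (nx >= 0 && nx < w && ny >= 0 && ny < h) {
                        sum += src[ny * w + nx];    // gather the neighbourhood
                        ++count;
                    }
                }
            }
            dst[y * w + x] = static_cast<uint8_t>(sum / count);
        }
    }
    return dst;
}
```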