
Topic: Digital illiteracy?

AWOL

#30
Jun 06, 2011, 09:13 am Last Edit: Jun 06, 2011, 01:57 pm by AWOL Reason: 1
The Rule 110 discussion is interesting - even the conjecture post-dates my graduation by some years, so I guess it isn't surprising I hadn't heard of it!
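For anyone else who hadn't met it either: as I understand it, Rule 110 is a one-dimensional cellular automaton in which each cell's next state depends only on itself and its two neighbours, and the rule number is just that lookup table written as a byte. A rough sketch of the idea (the grid size, wrap-around edges and starting pattern are arbitrary choices here, purely to illustrate):

Code:
#include <bitset>
#include <iostream>

const int N = 64;

// Compute one generation of Rule 110.
std::bitset<N> step(const std::bitset<N>& cells) {
    std::bitset<N> next;
    for (int i = 0; i < N; ++i) {
        int left    = cells[(i + N - 1) % N];  // wrap around at the edges (a choice)
        int centre  = cells[i];
        int right   = cells[(i + 1) % N];
        int pattern = (left << 2) | (centre << 1) | right;  // neighbourhood as 0..7
        next[i] = ((110 >> pattern) & 1) != 0; // bit 'pattern' of 110 is the new state
    }
    return next;
}

int main() {
    std::bitset<N> cells;
    cells[N - 1] = 1;                          // start from a single live cell
    for (int gen = 0; gen < 20; ++gen) {
        for (int i = 0; i < N; ++i) std::cout << (cells[i] ? '#' : '.');
        std::cout << '\n';
        cells = step(cells);
    }
    return 0;
}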

Like literacy, (IMHO) you can take the study of the subject too far and spoil the enjoyment (I always loved reading, but I hated picking books apart when studying English literature).
I studied AI for my degree, but its applications in subsequent employment were limited, so I never really kept up-to-date with whatever flavour of AI was popular.

Again, sometimes the professional societies carry the academic aspects a little too far - I grew tired of the BCS journal discussing aspects of the Tower of Hanoi problem across many months' worth of issues.

Get your hands dirty early and often.
If the esoteric stuff interests and excites you, all well and good.

focalist

#31
Jun 06, 2011, 08:23 pm Last Edit: Jun 06, 2011, 08:38 pm by focalist Reason: 1
But honestly, theory is often of little use in real-world situations: even when you recognize the fundamentals and ideals behind a particular design, there are often more important real-world constraints when actually making something that works.  It's a balance, in my opinion.. you can be so mired in details that you can't see the forest because there are too many trees in the way.

After all, isn't Arduino itself a simplified (and thereby in some ways crippled) version of Microcontroller development?  I am POSITIVE there are purists who scoff at the entire concept of Arduino, even those that use the exact same hardware.  Despite the scoffing, however, Arduino - along with Parallax and Microchip, other "simplified" microcontroller platforms - has brought more people to microcontroller development than any other effort.  Without the simplification, the target audience (NON-experts) simply finds it more daunting than entertaining.  It's not even a question of "can"; it's a question of "want to".

Even as a programmer, I mainly did high-level database work.  That really doesn't translate to microcontrollers (or Turing machines) very easily.. and from a hobby perspective I probably would have walked away after loading/licensing AVR Studio incorrectly for the fourth time.  Sure, if I were motivated enough, I'd do it.. but the process would be too much of a PITA to be entertaining.  Additionally, a decade's worth of professional software engineering in the real world never presented me with a single situation in which that level of dissection was needed.  I'm not saying you shouldn't be aware it exists.. but understanding electronics is hardly necessary to write Oracle stored procedures.  I think that's why you see the kind of split you do - once in the field, much of the theory becomes essentially worthless when you have to deliver a chunk of working code by Friday.

So, can't it be said Arduino is the very devil of which you speak?

If it's a race to the "bottom" in terms of technology, I've got a bucket of sand and a candle!


Onions




Oops, yes it should. I should have put

----------F-----
         E
--------D-------
       C
------B---------
     A
----G-----------
   F
--E-------------
 D



Quote
Yes, for treble clef. I don't remember bass clef any more. :~ Then, there are several other clefs on top of that. :)


I never knew bass clef, although I did know of its existence. I did not realise that there were so many other clefs too though!


Onions.
My website: http://www.harryrabbit.co.uk/electronics/home.html Up and running now! (Feel free to look round!) :D

mowcius

Quote
After all, isn't Arduino itself a simplified (and thereby in some ways crippled) version of Microcontroller development?  I am POSITIVE there are purists who scoff at the entire concept of Arduino, even those that use the exact same hardware.

Yep.

We need some people who understand the 'inners' of electronics, but it's more that there's a shortage of generally computer-literate people. Not just for computer/programming/electronics-based jobs but for every job - people should know how to plug in a power cable and turn on a computer (i.e. know that the power switch on the screen does not turn on the computer, etc...). They should not need to ring tech support every time a message pops up telling them the virus database has been updated, because they don't know what a virus (or database) is and think the message might be bad.

focalist

#34
Jun 06, 2011, 09:48 pm Last Edit: Jun 06, 2011, 10:07 pm by focalist Reason: 1
Well (getting ready to duck), consider Apple.. at least since the Mac.  Yes, I know what's under the hood.  Not the point.. but in some ways it IS the point.  The end user DOESN'T CARE that under the hood the current Mac OS is actually a form of UNIX.  In fact, most Mac users specifically choose a Mac because it completely removes them from the "computer" side of things and provides them with an Appliance.  I do apologize to the technical Mac users out there.. and there are plenty.. but even they must admit they are the exception and NOT the rule.

If that's what you are looking for, they deliver it.  Sure, it drives the techies crazy (Oh... the wasted CPU cycles.. bloatware.. argh..); but if you are, say, a graphic artist, you don't want a Computer.. you want a Graphic Arts Machine.  If all the tech whining in the world were blaring in that graphic artist's face, he wouldn't care, as long as the box in front of him let him draw whatever it was he wanted to draw.  It's because of this that Apple held that market... the artists didn't care that it wasn't an ideal application of technology - they only cared that it did a relatively simple task very well with little pain on their part.  The PC world in many cases (at the time) was technically more difficult to implement; it wasn't "plug and play" like Apple's closed architecture was.  Apple wasn't "Better Technically" at the time; they were instead "Technically Better", because a NON-IT end user could build a working network successfully, even if it wasn't the most efficient.

Windows is a little less elegant, but no less an attempt at abstracting away all the painful details to simplify a user interface.  For that matter, unless you take it (as a programmer) down to the level of bits and gates, you are "dumbing down" and by that measure would be a "noob".  I've never written Global Failure and Replication routines to "four nines" (99.99% uptime) in octal machine code - but I was able, via "Dumbing It Down", to do it fairly painlessly in Oracle and some creative scripting.  It's a question of using the right tool for the job.

The hope is that there are enough folks out there who, like us, by their very nature can't leave well enough alone and just HAVE to stick their fingers in the pudding.  We must realize and admit that this is an illness.  Only the best type of illness, mind you.. but an illness nonetheless.

Come on, get up, get down with the sickness..... ;)

AWOL

Quote
(Oh... the wasted CPU cycles.. bloatware

Oddly, that's what I hear most people say about Microsoft OSs.

focalist

#36
Jun 06, 2011, 10:00 pm Last Edit: Jun 06, 2011, 10:14 pm by focalist Reason: 1
At what point do you call anything more than binary gates Bloatware?  It's >> ALL << bloatware.

Arduino is bloatware.. C++ is bloatware.. C is bloatware.. Assembler is bloatware.. Compiled code of any sort * IS * bloatware.  Unless you code per clock cycle manually at the lowest level, everything (and I *do* mean evrything) is bloatware.

liuzengqiang

The point here is not "mastering programming for everyone"; it's more "a survey of what you can do with computers if you are allowed to control them with your own programs". I think lots of visual artists have their eyes opened when they are introduced to Processing - a simple and elegant way to do lots of visual stuff without Photoshop or whatnot. They even start learning kinematics to make things fall or bounce more realistically, sometimes quite spontaneously. I don't see many of my students learn kinematics in physics classes, though. If they know that there is a wonderful world inside every computer, the decision whether to explore it or not is up to them. Human nature may just dictate a yes answer. Simply locking it up, the way Apple or MS do for the sake of fewer customer service calls or whatnot, is wrong.
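In Processing that sort of thing is only a few lines in draw(); stripped of the graphics, the kinematics they pick up boils down to something like this (all the numbers here are made up, it is just gravity plus a lossy bounce):

Code:
#include <cstdio>

int main() {
    double y = 10.0;             // height above the floor (m) - made-up value
    double v = 0.0;              // velocity (m/s), positive is upwards
    const double g = -9.81;      // gravitational acceleration (m/s^2)
    const double dt = 0.02;      // time step (s)
    const double damping = 0.8;  // fraction of speed kept after each bounce

    for (double t = 0.0; t < 5.0; t += dt) {
        v += g * dt;             // simple Euler integration
        y += v * dt;
        if (y < 0.0) {           // hit the floor: clamp, reflect, lose some energy
            y = 0.0;
            v = -v * damping;
        }
        std::printf("t=%.2f  y=%.2f\n", t, y);
    }
    return 0;
}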

Imagine you have 5 fingers and, for your own safety as a baby, you are made to wear a pair of mittens (maybe to stop you from scratching the hell out of your itchy skin). After years of that constraint, you don't even know you ever had the freedom to move those 5 fingers independently; you just pick up food like an animal. That is the reality of years of dumbing the public down on computer literacy. Nobody knows how a computer works or what you can do with it (other than Facebook or Twitter). All they care about is what they see and what others do with computers.

Grumpy_Mike

Quote
At what point do you call anything more than binary gates Bloatware?  It's >> ALL << bloatware.

No, bloatware starts as soon as you get features you don't want or need.

Thanks for the pointer to Rule 110; I have never come across that. In my defense, I never did any formal study of computer science - basically I am an electronics engineer / physicist.

Quote
The end user DOESN'T CARE that under the hood the current Mac OS is actually a form of UNIX.

True but then I didn't think this was about end users.

mowcius

Quote
and I *do* mean evrything

As are vowels :D

mowcius

Quote
you don't want or need.

Bit like PPI then ;)

focalist

#41
Jun 06, 2011, 10:33 pm Last Edit: Jun 06, 2011, 10:41 pm by focalist Reason: 1
Well, I guess it depends upon what you call an "End User", doesn't it?  We as Arduino enthusiasts wouldn't be considered "end users" by most, but to Atmel's engineers we may very well be End Users.  I'd consider myself an "End User" of Arduino, but if I make something and someone uses that item.. I'm not the end user any more, am I?  Atmel is an "End User" to whoever it is that makes their silicon wafers.  That company is an end user of a sand company ;)   If there's a single person here who builds their own processors from sand, then I guess they can hold court over the rest... but otherwise.. it's all bloatware and someone else's end user.

Somewhere, there is a mailman that delivers his own mail.  I know it.  All I need to do to fill in that hole over there is to take a bunch of dirt from somewhere else.... isn't that Cervantes?

AWOL

Quote
Assembler is bloatware

Almost all assembler is 1:1 source to object; how can that be bloatware?

focalist

#43
Jun 06, 2011, 10:55 pm Last Edit: Jun 06, 2011, 11:05 pm by focalist Reason: 1
Because it DOES incorporate macros, usually.  I'm not saying the amount is large, but it is not non-existent for (most) assemblers.  It's a simplification at one level or another; heck, I suppose you could claim opcodes are bloatware.  If I write a program in machine language, there will still be opcodes I won't use.  They won't be needed for my project.  My GOD, there are TEN TIMES as many opcodes as my project needs - what a chunk of bloatware, huh?  :)  Have you ever considered how much bloatware it is that transistors other than the 2N3904 have been made since the start of time, considering that's what I've used for almost all of my projects?  Geez, the semiconductor industry is nothing but bloatware... lol.  Yes, that's hyperbole, but it's there to illustrate a point.. one person's bloatware is another's required core function.
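To put the macro point in C terms (a made-up example, nobody's real code): one innocent-looking line of source quietly expands into several operations, which is exactly the trick an assembler macro pulls.

Code:
#include <cstdint>

std::uint8_t port = 0;   // pretend this is a hardware register

// Looks like a single "instruction" at the source level...
#define PULSE(bit)                               \
    do {                                         \
        port |=  (std::uint8_t)(1u << (bit));    \
        port &= ~(std::uint8_t)(1u << (bit));    \
    } while (0)

int main() {
    PULSE(3);   // ...but expands into two read-modify-write sequences
    return 0;
}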

The point remains that there really isn't a "bottom" level of technology if you want to look closer.  Anything we do is built upon something else..  that's a goodness.  I don't want to invent Transistors before I can read my news blogs...  patterns within patterns within patterns.  Gee, sounds an awful lot like cellular automata... a mathematical abstraction, huh?

AWOL

I think we have a disagreement about a definition of "bloatware".

To me, bloat is a massive set of class libraries which may improve apparent programmer productivity, but which results in unnecessary amounts of imported code, or over-large stack frames, or mechanisms hidden from the user that allocate resources "just-in-case".
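A contrived example of the kind of hidden "just-in-case" work I mean (names made up; whether the string really hits the heap depends on the implementation):

Code:
#include <cstdio>
#include <string>

// The "convenient" version: the string class may allocate and reallocate
// behind the scenes so it is ready for whatever you might do next.
void heavyweight(int reading) {
    std::string msg = "reading=";
    msg += std::to_string(reading);     // possibly another allocation and a copy
    std::puts(msg.c_str());
}

// The lean version: a fixed buffer, a known cost, nothing held "just-in-case".
void lean(int reading) {
    char msg[24];
    std::snprintf(msg, sizeof msg, "reading=%d", reading);
    std::puts(msg);
}

int main() {
    heavyweight(42);
    lean(42);
    return 0;
}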
