Digital illiteracy?

I’ve never had to actually read music, and indeed, managed to somehow pass music at school without ever successfully reading it (I knew a lot about synthesizers, it was 1977, the music teachers were quite old-generation, and they probably thought I knew more than I did about music itself).

But, and this is relying on memory of my old Yamaha CX5m Music Composer cartridge I used to try and use in 1985…

Haven’t you got the lines and spaces assigned to the wrong notes? Shouldn’t it be FACE for the spaces, and EGBDF for the lines, reading from the bottom line up?

Oh, and that’s another thing — my natural assumption was shattered surprisingly once when I realised that music is the wrong way round. I’d naturally assumed that the bass frequencies are spatially analogous to “here, near my forehead” and as the frequency increases, it increases in the direction of “toward my toes”. That feels right to me. But no, music is the wrong way round — increased frequencies go the wrong way — up. Same as the piano keyboard — it’d be more natural to me to have the bass at the right end, the treble at the left end, and each time I plinky plonk at a piano keyboard it somehow surprises me that it’s the wrong way round.

Very lively discussion, with music in the background ;)

Let me chip in some stuff I've been thinking since I saw the first post.

1) There are too many digital illiterates compared with how many computers there are on the planet

2) Computer/digital literacy should be a part of high school/college general education classes (too bad only I and a CSCI professor on my campus would probably vote yes), just like there are English literacy classes on composition. Using computer programs is not complete literacy; that's like someone who can only read but not write. Basic programming skill is needed too.

3) Programming languages should be taught like other languages, with lots of reading of short to medium-length classic programs: finding errors in some readings and appreciating the beauty of others. I recall I did this accidentally when I got my 4 books of BASIC programs. There were lots of programs that were simple games, guessing numbers etc. To a 10-year-old they were perfect. Most of the editors at the publishing house didn't know BASIC, so they made simple printing mistakes here and there, and my brother and I were correcting mistakes as we went from one program to another.

Shouldn’t it be FACE for the spaces, and EGBDF for the lines, reading from the bottom line up?

Oops, yes it should. I should have put

----------F-----
E
--------D-------
C
------B---------
A
---G-----------
F
-E-------------
D

To add to the original topic, they do teach programming at our school, but it is not proper programming. We are shown how to program by drawing a flowchart on a computer and getting a big piece of complex software to turn it into code. All you need to do is draw a flowchart. That's all. Nothing else, just draw a flowchart. Then, you press F5 and it uploads the program the computer made onto a microcontroller they call a "genie chip". (Personally, I'd use an AVR :D) To me, this is not programming. It encourages problem solving, which is an important part of programming, but does not properly cover the subject in any depth. HOWEVER, there will always be people like me, and I'm sure other people on this forum, who want to know how things work, and why. It is these people who will learn by choice and take things further than would be possible in schools. Whenever there are people interested in something, there will be people willing to learn. Most people may be digitally illiterate, but there will always be people wanting to learn.

Onions.

I learned BASIC at school - took me ages to get out of that mindset. It wasn't until much later, when I studied formal grammars and Michael Jackson's (no, not that one) "System Development", that I really grasped proper programming constructs and mapping the problem onto software.

(Oh, and I don't consider the musical excursion as OT)

The thing is, it’s not just computers. I mean, you lot are all computer fanatics, spending most of your day up to your elbows with bytes, J-K flip-flops and conditional tests, but most people aren’t and their perspective of a computer just about equates to their perspective of a toaster. They just get on and use it, fixing it or modifying it is never even considered, just buy a new one at half the price of the old one.

It's exactly the same in many other domains, I would suspect. On one hand, I've been a designer since quite a few years before the advent of DTP, and hearing people refer to "cut and paste" without a clue how cutting and pasting was actually done is dismaying, but that's the reality of now. Similarly, when I was originally a professional photographer, I learned how to do unsharp masking, drop shadows and outlining entirely with film. Now people think those terms originated with Photoshop.

Cameras are an ideal example of how the technology has abstracted away from the user. For most of the 20th century it was possible for a person who owned a camera to read the manual, get used to the concept of exposure being affected simultaneously by varying the aperture setting and varying the shutter speed, and the idea of a smaller aperture giving a greater depth of field, a wider aperture giving a narrower depth of field, a slow shutter speed picking up motion blur and a fast one freezing motion, and the film speed equating with sensitivity. It wasn’t too difficult to grasp, and many people ended up doing so until auto exposure relieved them of that (and auto focus much later). But now we have programmed scenes in digital cameras, and the things we can vary are fundamentally different to the actual mechanistic variables of the original film camera.
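To put a number on that trade-off (a rough sketch of my own, with purely illustrative settings): the old variables obey EV = log2(N^2 / t), where N is the f-number and t the shutter time in seconds, so any two settings with the same EV give the same exposure on the same film.

// Rough sketch of the classic exposure trade-off, assuming EV = log2(N^2 / t):
// N is the f-number, t the shutter time in seconds. Two settings with the
// same EV give the same exposure at the same film speed.
#include <cmath>
#include <cstdio>

double exposureValue(double fNumber, double shutterSeconds) {
    return std::log2(fNumber * fNumber / shutterSeconds);
}

int main() {
    // f/8 at 1/125 s ...
    std::printf("f/8   @ 1/125 s : EV %.2f\n", exposureValue(8.0, 1.0 / 125.0));
    // ... is (near enough) the same exposure as f/5.6 at 1/250 s, just with a
    // shallower depth of field and half the motion blur.
    std::printf("f/5.6 @ 1/250 s : EV %.2f\n", exposureValue(5.6, 1.0 / 250.0));
    return 0;
}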

Now our variables – completely different variables – are varying an abstracted program, which is someone else’s idea of how the processor should process the Bayer-matrix data. Nobody is in direct control any more. If someone sets the camera to “lady with hat” or “screaming brat” or “distant mountain” or “underwater barbecue with fireworks” what on earth are they actually varying? Nobody knows. Well, no consumer knows. No professional photographer even knows. No camera expert even knows. Except the particular ones that programmed up the way that the “engine” performs the processing, and which abstractions they’ve given us to play with.

or “underwater barbecue with fireworks”

hahahahaha :D

Yeah I suppose it's similar in a load of different domains.

The issue at hand is that too many young people are digitally illiterate. It's not so much that everyone needs to be; it's that those children who have a mathematical mind and are interested should get the teaching they require to improve their skills. And the objection some would raise, that only some people will benefit from that (based on my analogy), isn't true: even the people who aren't interested or good at it will improve, and it'll be of more use to people than French (compulsory in many UK schools) is to most people.

justjed:

Grumpy_Mike:

You ask them about Rule 110 and they stare at you blankly...

Stares back at you blankly. Is it a US thing?

Rule 110 from the domain of cellular automata. I would say it's debatable whether that's an essential part of digital (or computational) literacy. If you're getting into gaming or other world simulation work, then yes.

Rule 110 is also the simplest known set of "rules" for a cellular automaton that can act as a Universal Turing Machine. I would say knowing this (and knowing what a UTM is, and why it is important to computer science) -is- an essential part of digital literacy for a computer scientist (though not necessarily for a child or even an adult learning programming), which is what I was getting at.
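For anyone who hasn't seen it, here's a toy sketch (just an illustration, nothing authoritative) of what Rule 110 actually does: each cell's next state depends only on itself and its two neighbours, and the number 110 (binary 01101110) is nothing more than the lookup table for the eight possible neighbourhoods.

// Toy Rule 110 cellular automaton: each generation, a cell's next state is
// looked up from its 3-cell neighbourhood (left, self, right).
// 110 decimal = 01101110 binary: one output bit per neighbourhood pattern.
#include <iostream>
#include <vector>

int main() {
    const unsigned rule = 110;
    const int width = 64, generations = 32;

    std::vector<int> cells(width, 0);
    cells[width - 1] = 1;                                 // one live cell to start

    for (int g = 0; g < generations; ++g) {
        for (int c : cells) std::cout << (c ? '#' : '.');
        std::cout << '\n';

        std::vector<int> next(width, 0);
        for (int i = 0; i < width; ++i) {
            int left  = cells[(i + width - 1) % width];   // wrap around the edges
            int self  = cells[i];
            int right = cells[(i + 1) % width];
            int pattern = (left << 2) | (self << 1) | right;  // 0..7
            next[i] = (rule >> pattern) & 1;                  // look up the output bit
        }
        cells = next;
    }
    return 0;
}

Run it and the familiar triangular structures appear; showing that those structures can be composed into a Universal Turing Machine is the hard (and remarkable) part.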

The fact that there are people out there graduating from supposed computer science courses without understanding the importance of Rule 110, without understanding what a UTM is, without knowing the contributions of Turing (and Church, and Russell, and Whitehead, and...) to computational theory - it boggles the mind.

At one time, there was this fear that ordinary people wouldn't understand (or care) about what goes on "inside the box", just so long as it worked, and woe to the world should it break. Only the "high priests" of computing would know or understand. What I am now seeing is that the supposed "high priests" of computing now being educated are also being educated in such a manner that even they don't understand what is going on "inside the box"! It would be like a newly educated PhD in chemistry not understanding covalent bonds, instead saying "I'll leave that up to the physicists" - or some such shenanigans!

This really bothers me - computer science encompasses so much - to not understand the basics of it does a real disservice to the field. If you don't understand Rule 110 or why it is important, if you don't understand what a UTM is, or why it works... I'm finding this difficult to put into words, but we (our species) are getting into the territory of learning how DNA really works; at its core, we understand that it -is- the "tape" (indirectly via RNA and tRNA) of a UTM, and that the complexity known as the ribosome is the read/write head and interpreter. We still don't completely understand how the ribosome works, but it is likely going to take the knowledge of UTMs, cellular automata, and Rule 110 (among others) to figure it out (along with how the whole DNA coding system works to generate proteins, etc).

That's the importance of it for computer science, and for biology. We are on the cusp of altering our programming on such a fundamental level that it is exciting, intriguing, frightening and mind-boggling all at once. It makes me giddy knowing there are people out there playing with this stuff (and I sincerely wish I could be a part of it, but my understanding and education do not come close to what is really needed).

To see some schools here in the USA (I can't speak for other countries) gloss over such knowledge frustrates me; perhaps I'm being unfair, and maybe this knowledge is covered in another area of the computer science curriculum at these schools (though I tend to wonder why it would be broken out like this - it seems that it isn't a requirement, though it really should be - I sometimes wonder how you could expect to write an emulator, say, without understanding what a UTM is).

...sigh...

A lot of that is lost on me and I’d class myself as being rather above average in this field.

Unfortunately, not being taught it at an early age means that I doubt I will ever really reach the level of knowledge of computers and chips that some of you guys are at, and as technology advances, I will learn the new stuff and still not understand what goes on inside.

cr0sh: Rule 110 is also the simplest known set of "rules" for a cellular automaton that can act as a Universal Turing Machine. I would say knowing this (and knowing what a UTM is, and why it is important to computer science) -is- an essential part of digital literacy for a computer scientist (though not necessarily for a child or even an adult learning programming), which is what I was getting at.

The fact that there are people out there graduating from supposed computer science courses without understanding the importance of Rule 110, without understanding what a UTM is, without knowing the contributions of Turing (and Church, and Russell, and Whitehead, and...) to computational theory - it boggles the mind.

Well, I completely get your point, but I think I'm using a different definition of literacy. And I think we can use the more typical definition of literacy as an example. You can learn to read/write at a level which allows you to be functional in society, or you can become literate at a higher level, e.g. broader vocabulary, multiple languages, studying linguistics. I don't argue at all that a Comp-Sci degree should include things such as Turing-completeness, AI, and P / NP problems. I didn't attend college at all, and didn't even know about such things until I was well into a very successful career. So, was I digitally literate? I think that I was, and am. But then, I have a knack for this stuff, and did things like read the PDP Architecture Handbook. Heck, I was often the 'mentor' on the software team, helping BSCS types get their heads into what real-world systems development was about. Turing-completeness is cool. It also doesn't help you a lot when you're trying to write a 1.5-page-long SQL statement. You can be literate in your problem domain area without necessarily understanding cellular automata.

But what I do hate to see is so-called programmers who don't know how to do anything other than string together snippets plopped into the "action" dialog of a GUI IDE. You might well get something that runs and does what's expected, but you don't know why. And that is a recipe for disaster in the real world.

Onions:
Oops, yes it should. I should have put

----------F-----
E
--------D-------
C
------B---------
A
---G-----------
F
-E-------------
D

Yes, for treble clef. I don't remember bass clef any more. :~ Then, there are several other clefs on top of that. :)

The Rule 110 discussion is interesting - even the conjecture post-dates my graduation by some years, so I guess it isn't surprising I hadn't heard of it!

Like literacy, (IMHO) you can take study of the subject too far, and spoil the enjoyment (I always loved reading, but I hated picking books apart when studying English literature). I studied AI for my degree, but its applications in subsequent employment were limited, so I never really kept up-to-date with whatever flavour of AI was popular.

Again, sometimes the professional societies carry the academic aspects a little too far - I grew tired of the BCS journal discussing aspects of the Tower of Hanoi problem over many months' worth of editions.

Get your hands dirty early and often. If the esoteric stuff interests and excites you, all well and good.

But honestly, theory is often of little use in real-world situations, in that although it is recognised there are fundamentals and ideals in a particular design, there are often more important real-world constraints when actually making something that works. It's often a balance, in my opinion.. one can be so mired in details that you can't see the forest because there are too many trees in the way.

After all, isn't Arduino itself a simplified (and thereby in some ways crippled) version of Microcontroller development? I am POSITIVE there are purists who scoff at the entire concept of Arduino, even those that use the exact same hardware. Despite the scoffing, however, Arduino - along with Parallax and Microchip and other "simplified" microcontroller platforms - has brought more people to microcontroller development than any other effort. Without the simplification, the target audience (NON-experts) simply find it more daunting than entertaining. It's not even a question of "Can", it's a question of "Want to".

Even as a programmer, I did high-level database work mainly. That really doesn't translate to microcontrollers (or Turing Engines) very easily.. and from a hobby perspective, I probably would have walked away after loading/licensing AVR Studio incorrectly for the fourth time. Sure, if I were motivated enough, I'd do it.. but the process would be too much of a PITA to be entertaining. Additionally, a decade's worth of professional software engineering in the real world never presented me with a single situation in which that level of dissection was needed. I'm not saying you shouldn't be aware it exists.. but it is hardly necessary to understand electronics to write Oracle stored procedures. I think that's why you see the kind of split you do - once in the field, much of the theory becomes essentially worthless when you have to deliver a chunk of working code by Friday..

So, can't it be said Arduino is the very devil of which you speak?

If it's a race to the "bottom" in terms of technology, I've got a bucket of sand and a candle!

justjed:

Onions: Oops, yes it should. I should have put

----------F-----
E
--------D-------
C
------B---------
A
---G-----------
F
-E-------------
D

Yes, for treble clef. I don't remember bass clef any more. :~ Then, there are several other clefs on top of that. :)

I never knew bass clef, although I did know of its existence. I did not realise that there were so many other clefs too though!

Onions.

After all, isn't Arduino itself a simplified (and thereby in some ways crippled) version of Microcontroller development? I am POSITIVE there are purists who scoff at the entire concept of Arduino, even those that use the exact same hardware.

Yep.

We need some people who understand the 'inners' of electronics, but it's more a shortage of generally computer literate people. Not just for computer/programming/electronics based jobs but for every job - people should know how to plug in a power cable and turn on a computer (i.e. know that the power for the screen does not turn on the computer etc...). They should not need to ring tech support every time a message pops up telling them the virus database has been updated because they don't know what a virus (or database) is and think the message might be bad.

Well (getting ready to duck), consider Apple.. at least since Mac. Yes I know what's under the hood. Not the point.. but in some ways it IS the point. The end user DOESN'T CARE that under the hood the current Mac OS is actually a form of UNIX. In fact, most Mac users specifically choose Mac because it completely removes them from the "computer" side of things and provides them with an Appliance. I do apologize to the Technical mac users out there.. and there are plenty.. but even they must admit they are the exceptions and NOT the rule.

If that's what you are looking for, they deliver it. Sure, it drives the techies crazy (Oh... the wasted CPU cycles.. bloatware.. argh..); but if you are, say, a Graphic Artist, you don't want a Computer.. you want a Graphic Arts Machine. If all the tech whining in the world were blaring in that Graphic Artist's face, he wouldn't care, as long as the box in front of him let him draw whatever it is he wanted to draw. It's because of this that Apple held that market... the Artists didn't care if it wasn't an ideal application of technology - they only cared that it did a relatively simple task very well with little pain on their part. The PC world in many cases (at the time) was more difficult technically for someone to implement; it wasn't "plug and play" like Apple's closed architecture was. Apple wasn't "Better Technically" at the time, they were instead "Technically Better", because a NON-IT end user could build a working network successfully, even if it wasn't the most efficient. Windows is a little less elegant, but no less an attempt at abstracting away all the painful details to simplify a user interface. For that matter, unless you take it (as a programmer) down to the level of bits and gates, you are "dumbing down" and by that measure would be a "noob". I've never written Global Failure and Replication routines to "four nines" (99.99% uptime) in octal machine code - but I was able, via "Dumbing It Down", to do it fairly painlessly in Oracle and some creative scripting. It's a question of using the right tool for the job.

The hope is that there's enough folks out there that by their very nature (like us) can't leave well enough alone and just HAVE to stick our fingers in the pudding. We must realize and admit that this is an illness. Only the best type of illness mind you.. but an illness nonetheless.

Come on, get up, get down with the sickness..... ;)

(Oh... the wasted CPU cycles.. bloatware

Oddly, that's what I hear most people say about Microsoft OSs.

At what point do you call anything more than binary gates Bloatware? It’s >> ALL << bloatware.

Arduino is bloatware… C++ is bloatware… C is bloatware… Assembler is bloatware… Compiled code of any sort * IS * bloatware. Unless you code per clock cycle manually at the lowest level, everything (and I do mean evrything) is bloatware.

The point here is not "mastering programming for everyone", more of "a survey of what you can do with computers if you are allowed to control them with your own programs". I think lots of visual artists have their eyes opened when they are introduced to Processing, a simple and elegant way to do lots of visual stuff without Photoshop or whatnot. They even start learning kinematics to make things fall or bounce more realistically, sometimes very spontaneously. I don't see many of my students learn kinematics in physics classes though. If they know that there is a wonderful world inside of every computer, the decision whether to explore it or not is up to them. Human nature may just dictate the yes answer. Simply locking it up like Apple or MS do, for the sake of fewer customer service calls or whatnot, is wrong.
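To be concrete about the kinematics bit, here's a toy sketch (plain C++ rather than Processing, and purely my own illustration) of the kind of thing those artists pick up: integrate velocity and position every frame, and flip the velocity when the ball hits the floor.

// Toy bouncing-ball kinematics: integrate velocity and position each frame,
// reverse (and damp) the velocity when the ball reaches the floor.
#include <cstdio>

int main() {
    double y = 100.0, vy = 0.0;              // height and vertical velocity
    const double gravity = -9.8, dt = 0.05, bounce = 0.8;

    for (int frame = 0; frame < 100; ++frame) {
        vy += gravity * dt;                  // gravity pulls the ball down
        y  += vy * dt;                       // move the ball
        if (y < 0.0) {                       // hit the floor?
            y  = 0.0;
            vy = -vy * bounce;               // rebound with 80% of the speed
        }
        std::printf("frame %3d: y = %6.2f\n", frame, y);
    }
    return 0;
}

In Processing you'd draw a circle at y each frame instead of printing it, but the physics is the same couple of lines.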

Imagine you have 5 fingers and you are, for your own safety, wearing a pair of mittens as a baby (maybe to prevent you from scratching the hell out of your itchy skin). Then, after years of constraint, you don't even know you had the freedom to move those 5 fingers independently, and you just pick up food like an animal. This is the reality of years of dumbing the public down on computer literacy. Nobody knows how a computer works and what you can do with it (other than Facebook or Twitter). All they care about is what they see and what others do with computers.

At what point do you call anything more than binary gates Bloatware? It’s >> ALL << bloatware.

No, bloatware starts as soon as you get features you don't want or need.

Thanks for the Rule 110 pointer, I have never come across that. In my defense, I never did any formal study of computer science; basically I am an Electronics engineer / physicist.

The end user DOESN’T CARE that under the hood the current Mac OS is actually a form of UNIX.

True but then I didn’t think this was about end users.

and I do mean evrything

As are vowels :D