Digital illiteracy?

Well, but how on earth do you tell a computer what to do with a language that is imprecise? Of course it would be nice to have the software implement "don't do what I say, do what I want/intend/need" or similar. It's quite impossible to make a machine without intuition, common sense and humour do what you want if you can't precisely write down what it should do. Most likely you don't fully understand the problem yourself if you can't do that. I have a hard time with C++ myself, all this class business, inheritance, the dreaded 'this->...', but fortunately I don't need any of it for my projects.

You don't have to go down that deep to observe digital illiteracy btw. I don't want to know how many people still click on links in emails of questionable origin. At some level everybody becomes illiterate, but maybe that threshold is getting lower these days.

"When I was a boy..."

Did I say “imprecise”? Where did you get “imprecise” from? Are you under the impression that it has been mentioned prior to your use of it in this conversation? Why would you interpret anything said so far as equating with “imprecise”?

Well, maybe I got it wrong.

The opposite of what you've declared as 'bad' sounds, to me, like dragging and dropping a few very colourful icons onto a 'sketch' and wiring them together for proper processing. Something like that is counter-productive as well.

Languages on the other hand seem to have been marooned back in the ’60s by a clique of beardy self-flagellating programmers that learned pointlessly badly designed languages the hard way

I still think computing is pretty much at the stage steam engines were at before the safety valve or the centrifugal governor were invented.

Ian, if you’re worried about ’60s-era languages, don’t use ’60s-era architectures with ’60s-era capabilities.

Quote from Ian Tindale: "Languages on the other hand seem to have been marooned back in the ’60s by a clique of beardy self-flagellating programmers that learned pointlessly badly designed languages the hard way and believe that this is the only way to achieve true power"

Well, in point of fact, coding in assembly language gives you a lot of power over what the machine does. ]:) And in fact, I had been writing COBOL, BASIC, and FORTRAN before I did any serious work in assembly language, and it was doing that which made me a much better programmer in any language.

There are, in fact, reasons for the design decisions in any language. They aren't pointless. You might not agree with the design parameters, but in fact they can be quite pointed (or pointy, or use pointers). The verbose monstrosity known as COBOL was designed, on purpose, to be verbose, so that it would be closer to being self-documenting. FORTRAN -- FORmula TRANslator -- was designed as a scientific language. And back then, compute cycles, storage, and memory were all hideously expensive. Lots of compromises went into programming back then. I wrote a lot of the code that people later bitched about when Y2K came around. But I guess nobody remembers that there was a time when you saved 2 bytes if you could.
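Just to illustrate the two-digit-year trade-off (a toy example of my own, not anything from those old systems): you save the bytes, but the century is gone, and simple date arithmetic breaks at the rollover.

# Toy Python illustration of the two-digit-year saving and its Y2K cost.
def years_between(yy_start, yy_end):
    # Naive difference on two-digit years, the way a lot of old code did it.
    return yy_end - yy_start

print(years_between(65, 99))  # 34  -- fine while both dates are in the 1900s
print(years_between(99, 0))   # -99 -- the rollover bug once 2000 arrives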

And when it comes to writing code, well, you are still going to need statements for addition, subtraction, looping, etc., and it's difficult to envision how to state those things in ways that don't sound like -- well, what they are.

There are, however, many, many modern languages from which to choose. I couldn't begin to list them all. Perl, for example, is a far cry from FORTRAN. Some people hate it. But I wouldn't describe Larry Wall as someone who's marooned in the sixties. (He doesn't have a beard either.)

Referring back to the original article, digital illiteracy is a problem. I hear this a lot from my friends who are still working in the IT industry. Lots of younger coders are coming out of college without understanding what real world programming is about, and haven't learned the sort of fundamental logical skills that come from writing in languages such as C or FORTRAN. Pulling up an IDE and stringing together a bunch of pasted-in Java classes isn't programming. I wonder how many recent Comp-Sci graduates even know who Donald Knuth is.

BTW, I do have a beard, but I don't whip myself these days. :)

ETA: Oh, and Linux Rules! What better platform to use, than one which will let you delve into any aspect you want -- even to the point of kernel programming. And where all the code is there for you to read. If you want to really learn how to write good code, you should probably be reading good code. Try doing that with your closed source Windoze boxen.

What I'm wondering is why this "Raspberry Pi" device?

http://en.wikipedia.org/wiki/Raspberry_Pi

Sure, it seems like a nice system, and very low price - but in the end, it's a small computer running Ubuntu. Any "game development" teaching is likely being done with PyGame (and SDL, PyOpenGL, etc.). So why not just use a bootable USB stick with Ubuntu on it (cheaper than the Raspberry Pi) - and then give the Raspberry Pi to those who don't own or can't afford a full-sized PC? That would seem to be the best use of all resources.
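For what it's worth, the sort of thing those PyGame lessons tend to start with looks like this - just a sketch of my own, assuming a stock pygame install, nothing specific to the Raspberry Pi:

# Minimal PyGame example: open a window and move a square with the arrow keys.
import pygame

pygame.init()
screen = pygame.display.set_mode((320, 240))
clock = pygame.time.Clock()
x, y = 150, 110

running = True
while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False
    keys = pygame.key.get_pressed()
    x += (keys[pygame.K_RIGHT] - keys[pygame.K_LEFT]) * 4
    y += (keys[pygame.K_DOWN] - keys[pygame.K_UP]) * 4

    screen.fill((0, 0, 0))
    pygame.draw.rect(screen, (255, 255, 255), (x, y, 20, 20))
    pygame.display.flip()
    clock.tick(30)  # cap at 30 frames per second

pygame.quit()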

Ultimately, though - such a machine, though considered "small" by today's standards, is still a very far cry from the "small" 8-bit machines akin to the BBC Micro and others (in my case, it was a TRS-80 Color Computer). There's nothing you can do with the Raspberry Pi that couldn't be done with a full-sized machine, and regardless of which is used, children still wouldn't have any clue as to what is -really- going on under the hood.

If they were smart, they'd give each kid a Nootropic Design Hackvision kit, and teach them how to put it together, what the parts are for and how/why they work, use it for teaching how to create games and other small applications, as well as interfacing other controllers or custom controllers, show how to run motors for vibration feedback, etc.

Slightly more expensive, sure - but waaaaay more accessible to kids (and adults), and more understandable, since it is still small, 8-bit, simple monochrome graphics, etc.

One other thing I want to note, regarding the article: The article bemoans the idea that kids are becoming "digitally illiterate", but I would argue that many adults - including many in computer science - are just as illiterate! At least here in the States, it seems many (not all - there are still good ones around if you look) supposed "computer science" courses are nothing but glorified "learn to program in Java" (or some other language) courses. It seems there is little to no emphasis placed on algorithms and other "base/core/foundational" curricula of computer science. Some of the graduates of these schools can barely code a simple calendar, let alone tell you what a Turing machine is, why NOR/NAND gates are important, etc. (I could really go on and on - suffice it to say, many of these foundational concepts in computer science and the history of computation are not taught, or are barely touched on).
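(Since I brought up NAND gates: the point is that NAND is functionally complete - every other Boolean gate can be built from it alone. A quick Python sketch of my own, purely to illustrate:)

# Build NOT, AND, OR and XOR out of nothing but NAND.
def nand(a, b):
    return not (a and b)

def not_(a):
    return nand(a, a)

def and_(a, b):
    return not_(nand(a, b))

def or_(a, b):
    return nand(not_(a), not_(b))

def xor_(a, b):
    return and_(or_(a, b), nand(a, b))

# Print the truth table to check the derived gates behave as expected.
for a in (False, True):
    for b in (False, True):
        print(a, b, and_(a, b), or_(a, b), xor_(a, b))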

You ask them about Rule 110 and they stare at you blankly...

You ask them about Rule 110 and they stare at you blankly…

Stares back at you blankly. Is it a US thing?

Rule 110 is the one that comes right after Rule 101

Allow me to put aside for the moment the notion of language syntaxes that have appalling usability which those heroes that have somehow accomplished fluency in will defend to the utmost of idiocy, forever closed to the possibility that improvement may be possible. This clearly isn’t the venue to suggest such heresy, I’m talking to a brick wall in that respect. However, I also have a different point, and it is this:

Personally, I think that a significant portion of the problem is that there is no obvious visible connection that anyone can at first see between a problem and its computer-implemented solution. It’s as simple as that. Almost no program I’ve ever seen and understood goes about things in the way I would if it were me pretending to be a computer. Almost all actual working solutions are not only mysteriously arcane, but seem to be the product of an alien mentality. You can look at a lot of programs, and if you are told what it is supposed to achieve and shown how it actually does it, you’d think that nobody in a month of Sundays would ever arrive at such a convoluted and illogically unintuitive way of getting there. It’s all just so wrong! This is why people can’t learn it.

You can teach people the funny words (or keywords), teach people the stupid syntax, teach people to not do what they’ve always done (for example, “=” doesn’t mean that any more) and teach people about making sure the excessive amounts of punctuation match up, etc. But none of this actually teaches people how to make a program achieve what is required. None of this actually teaches people how to arrive at a solution. None of it actually teaches people how to turn a requirement into a program.

You can know the syntax all you like, but that won’t make a program happen if you can’t see the structural link between what you want to have happen and what alternative alien structures you’re going to transform that comprehension into. What is being taught is how to recognise the syntax to the level that allows you to cut and paste existing programs (for that is where the magic is contained) enough to botch together your own adapted implementation. And that’s as far as it goes.

Ian, Tell me. Can you read Cyrillic? Pitman shorthand? A knitting pattern?

I feel pretty much the way you feel about computer languages when I look at sheet music. (I was never taught how to read music)

(for example, “=” doesn’t mean that any more)

That's an interesting one. To me = means equals, so, if I write ax² + bx + c = 0, I know what that means. But then some joker in a computer lab says "x = x + 1". He's clearly into substance abuse.

But then someone else writes "x := x + 1", I can sort of screw up my eyes and squint a little and imagine that the := is actually a left-pointing arrow, so it is clear that he isn't saying that x is the same as x plus one, he's saying that x is assigned the result of the computation of x plus one.

That's what I see when I read a C program, even though I know the colon isn't there.

It's an associative thing, even though part of the association isn't there, like when I see the number "1664", I don't see "one thousand six hundred and sixty four", I see "seize cent soixante quatre" and I see (in my mind) a nice cool glass of beer.
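To put the same point in code (a minimal example of my own, in Python rather than C, so no colon in sight):

x = 5
x = x + 1           # "=" assigns: store the result of x + 1 back into x
print(x)            # 6
print(x == x + 1)   # False -- "==" is the actual test for equality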

I feel pretty much the way you feel about computer languages when I look at sheet music. (I was never taught how to read music)

It is completely off the original topic, but sheet music is actually fairly simple to read. It is based around five lines, and the position of the markings on or between those lines dictate the note:

---------------------------------------- E
                                          D
---------------------------------------- C
                                          B
---------------------------------------- A
                                          G
---------------------------------------- F
                                          E
---------------------------------------- D

The musical alphabet goes A, B, C, D, E, F, G, and one set of these is called an octave. Now for the markings on it. There are semibreves - they are four beats long, minims - two beats, crotchets - one beat, quavers - 1/2 beat, and semiquavers - 1/4 beat. (There are more notes, but these are the main ones). And that is the basics of reading sheet music! The five lines are called the staff, and you can get notes above and below it, but they follow the same pattern as the others. For the note symbols, check out Wikipedia - they have nice diagrams.
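Since this is a programming forum after all, the duration arithmetic above fits in a few lines (a little sketch of my own, using the British note names from the post):

# Note lengths in beats, as described above.
note_beats = {
    "semibreve": 4,
    "minim": 2,
    "crotchet": 1,
    "quaver": 0.5,
    "semiquaver": 0.25,
}

# Total length of a short phrase, in beats:
phrase = ["crotchet", "crotchet", "minim", "semibreve"]
print(sum(note_beats[n] for n in phrase))  # 8.0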

Onions.

Well, getting back to the original topic - I also agree that this Raspberry Pi computer is going to do nothing to help the real problem of kids not knowing how it works / basic computer hardware theory.

Needless to say I'd like a few if they're going to be that cheap!

@Onions: Interesting, but if there are eight notes to the octave, why five lines? Or should I be reading between the lines?

Grumpy_Mike:

You ask them about Rule 110 and they stare at you blankly...

Stares back at you blankly. Is it a US thing?

Rule 110 is from the domain of cellular automata: a one-dimensional rule that happens to be Turing-complete. I would say it's debatable whether that's an essential part of digital (or computational) literacy. If you're getting into gaming or other world-simulation work, then yes.
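For anyone still staring blankly, here is a minimal sketch of Rule 110 (my own, in Python). Each cell looks at itself and its two neighbours, and the new state comes from the corresponding bit of the number 110 (binary 01101110).

RULE = 110

def step(cells):
    # One Rule 110 update over a list of 0/1 cells, with fixed 0 boundaries.
    n = len(cells)
    nxt = [0] * n
    for i in range(n):
        left = cells[i - 1] if i > 0 else 0
        right = cells[i + 1] if i < n - 1 else 0
        pattern = (left << 2) | (cells[i] << 1) | right  # 0..7
        nxt[i] = (RULE >> pattern) & 1
    return nxt

cells = [0] * 63 + [1]   # start with a single live cell at the right edge
for _ in range(30):
    print("".join("#" if c else "." for c in cells))
    cells = step(cells)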

AWOL: @Onions: Interesting, but if there are eight notes to the octave, why five lines? Or should I be reading between the lines?

Yes, if the note symbol is on the line, it counts as one note. If it is between the lines, it is another note. I guess they made it like that to save space when you get a really long piece of music.

Onions.

I’ve never had to actually read music, and indeed, managed to somehow pass music at school without ever successfully reading it (I knew a lot about synthesizers, it was 1977, the music teachers were quite old-generation, and they probably thought I knew more than I did about music itself).

But, and this is relying on memory of my old Yamaha CX5m Music Composer cartridge I used to try and use in 1985…

Haven’t you got the lines and spaces assigned to the wrong notes? Shouldn’t it be FACE for the spaces, and EGBDF for the lines, reading from the bottom line up?

Oh, and that’s another thing — my natural assumption was shattered surprisingly once when I realised that music is the wrong way round. I’d naturally assumed that the bass frequencies are spatially analogous to “here, near my forehead” and as the frequency increases, it increases in the direction of “toward my toes”. That feels right to me. But no, music is the wrong way round — increased frequencies go the wrong way — up. Same as the piano keyboard — it’d be more natural to me to have the bass at the right end, the treble at the left end, and each time I plinky plonk at a piano keyboard it somehow surprises me that it’s the wrong way round.

Very lively discussion, with music in the background ;)

Let me chip in some stuff I've been thinking since I saw the first post.

1) There are too many digital illiterates compared with the number of computers on the planet.

2) Computer/digital literacy should be a part of high school/college general education classes (too bad only I and a CSCI professor on my campus would probably vote yes), just like there are English literacy classes on composition. Using computer programs is not complete literacy; that's like being able to read but not write. Basic programming skill is needed too.

3) Programming languages should be taught like other languages, with lots of reading of short to medium-length classic programs - finding errors in some of the reading and appreciating the beauty of the rest. I recall I did this accidentally when I got my 4 books of BASIC programs. There were lots of programs that were simple games, guessing numbers etc. To a 10-year-old they were perfect. Most of the editors at the publishing house didn't know BASIC, so they made simple printing mistakes here and there, and my brother and I were correcting mistakes as we went from one program to another.

Shouldn’t it be FACE for the spaces, and EGBDF for the lines, reading from the bottom line up?

Oops, yes it should. I should have put

---------------------------------------- F
                                          E
---------------------------------------- D
                                          C
---------------------------------------- B
                                          A
---------------------------------------- G
                                          F
---------------------------------------- E
                                          D

To add to the original topic, they do teach programming at our school, but it is not proper programming. We are shown how to program by drawing a flowchart on a computer and getting a big piece of complex software to turn it into code. All you need to do is draw a flowchart. That's all. Nothing else, just draw a flowchart. Then, you press F5 and it uploads the program the computer made onto a microcontroller they call a "genie chip". (Personally, I'd use an AVR :D) To me, this is not programming. It encourages problem solving, which is an important part of programming, but does not properly cover the subject in any depth. HOWEVER, there will always be people like me, and I'm sure other people on this forum, who want to know how things work, and why. It is these people that will learn by choice, take things further than would be possible in schools. Whenever there are people interested in something, there will be people willing to learn. Most people may be digitally illiterate, but there will always be people wanting to learn.

Onions.

I learned BASIC at school - took me ages to get out of that mindset. It wasn't until much later, when I studied formal grammars and Michael Jackson's (no, not that one) "System Development", that I really got proper programming constructs, and mapping the problem onto software.

(Oh, and I don't consider the musical excursion as OT)

The thing is, it’s not just computers. I mean, you lot are all computer fanatics, spending most of your day up to your elbows with bytes, J-K flip-flops and conditional tests, but most people aren’t and their perspective of a computer just about equates to their perspective of a toaster. They just get on and use it, fixing it or modifying it is never even considered, just buy a new one at half the price of the old one.

It’s exactly the same in many other domains, I would suspect. On one hand I’ve been a designer since quite a few years before the advent of DTP and to hear people refer to “cut and paste” without a clue how cutting and pasting was actually done is dismaying, but that’s the reality of now. Similarly, originally when I was a professional photographer, I learned how to do unsharp masking, drop shadows, outlining, entirely with film. Now people think those terms originated with Photoshop.

Cameras are an ideal example of how the technology has abstracted away from the user. For most of the 20th century it was possible for a person who owned a camera to read the manual, get used to the concept of exposure being affected simultaneously by varying the aperture setting and varying the shutter speed, and the idea of a smaller aperture giving a greater depth of field, a wider aperture giving a narrower depth of field, a slow shutter speed picking up motion blur and a fast one freezing motion, and the film speed equating with sensitivity. It wasn’t too difficult to grasp, and many people ended up doing so until auto exposure relieved them of that (and auto focus much later). But now we have programmed scenes in digital cameras, and the things we can vary are fundamentally different to the actual mechanistic variables of the original film camera.
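That film-era relationship can even be written down as plain arithmetic. A rough sketch (my own, using the usual exposure-value definition) of why opening the aperture a stop while halving the exposure time comes to much the same thing:

import math

def exposure_value(f_number, shutter_seconds, iso=100):
    # EV = log2(N^2 / t), adjusted for film speed relative to ISO 100.
    return math.log2(f_number ** 2 / shutter_seconds) - math.log2(iso / 100)

# Two "equivalent" exposures (f-numbers are rounded, so the values are close,
# not identical):
print(exposure_value(8, 1/125))    # about 13.0
print(exposure_value(5.6, 1/250))  # about 12.9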

Now our variables – completely different variables – are varying an abstracted program, which is someone else’s idea of how the processor should process the Bayer-matrix data. Nobody is in direct control any more. If someone sets the camera to “lady with hat” or “screaming brat” or “distant mountain” or “underwater barbecue with fireworks” what on earth are they actually varying? Nobody knows. Well, no consumer knows. No professional photographer even knows. No camera expert even knows. Except the particular ones that programmed up the way that the “engine” performs the processing, and which abstractions they’ve given us to play with.