Digital illiteracy?

http://news.bbc.co.uk/1/hi/programmes/click_online/9503255.stm

Does it run DOOM?

Wait, nobody knows that one anymore. Make it Duke Nukem Forever...

Seriously, what school will use a product that doesn't come with Windows preloaded? They're all indoctrinated at the neuron level that Windows IS the world of computing. Everything else is illegal at best. This is a nice example, maybe already known. Still a good read.

What is this "Duke Nukem Forever"?
Is it anything like "Duke Nukem 3d Atomic edition or plutonium edition"?

Lol, I wish they would teach programming using only open-source programs; that way kids would learn how to build proper software.

Cowabunga!

No, that was Mr. Lo Wang (Shadow Warrior) :wink:

"The shuriken! I love the shuriken! i love it! i used to go to the movies with a pocket full of them! i throw the shuriken in the dark... and where it lands, nobody knows... [yelp!] there! one landed!"


Personally I think open source software, and especially software that only uses non-proprietary file formats, should be mandatory for schools and governments. In Germany the department of foreign affairs migrated from MS to some sort of Linux a few years ago. Now they're migrating back to MS... Apparently the users complained that they couldn't run their favourite software, probably Solitaire and silly PowerPoint slideshows sent via email. I don't see why schools and governments should be dependent on a company with a quasi monopoly. Ever tried to buy a computer and not pay the MS tax? Pirates. I don't mind them trying to make money, but I demand to have a choice to avoid them.

Computer/digital illiteracy works in favour of MS. People get locked in after some time.

I'm by definition the admin of my parents' computer. It was running XP for several years, using Acronis TrueImage for backups. A very good product btw, especially for bare-metal recovery. I created the initial backup and let it run on a schedule. Restoring was easy enough for my dad to do it himself. The last time it gave me a headache I just installed openSUSE and added a virtual XP machine using VirtualBox. I also bought a good book for Linux newbies that started something like this: "... 'root directory' .... HELP, where am I?...". No more trouble with viruses either.

The virtual XP machine hasn't been used ONCE since then.

I'm not sure my dad could install or re-install his machine from scratch (without wiping out his user data) or use a terminal without me giving instructions via the phone. It seems he now just expects the machine to run. And if it doesn't I'm informed that I should pay them a visit and be recompensed with a good meal.

It highlights a blatant failure on the part of the software development world to address issues such as usability, intuitiveness and workflow management by creating usable, intuitive and effective languages. User-centred approaches to application and web application development recognised these shortcomings long ago, and we now have apps that rank highly in usability, intuitiveness and flow. Languages on the other hand seem to have been marooned back in the ’60s by a clique of beardy self-flagellating programmers that learned pointlessly badly designed languages the hard way and believe that this is the only way to achieve true power, and that everyone should continue to suffer. This has held back the development of languages that are actually nice to use, and as a result we’re in precisely the situation described in the article.

Well, but how on earth do you tell a computer what to do with a language that is imprecise? Of course it would be nice to have software implement "Don't do what I say, do what I want/intend/need" or something similar. It's quite impossible to make a machine without intuition, common sense and humour do what you want if you can't precisely write down what it should do. Most likely you don't fully understand the problem yourself if you can't do that. I have a hard time with C++ myself (all this class business, inheritance, the dreaded 'this->'), but fortunately I don't need any of it for my projects.
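Just to show what I mean by the "class business", here's a made-up toy (the names are invented, it's not from any real project): one base class, one derived class, and that dreaded 'this->'.

```cpp
#include <iostream>

// Base class: a plain counter.
class Counter {
public:
    Counter() : count(0) {}
    virtual ~Counter() {}

    // 'this' is a pointer to the object the method was called on,
    // so 'this->count' just means "my own count member".
    virtual void increment() { this->count = this->count + 1; }

    int value() const { return count; }

protected:
    int count;   // visible to derived classes
};

// Inheritance: a StepCounter is a Counter that counts in bigger steps.
class StepCounter : public Counter {
public:
    explicit StepCounter(int stepSize) : step(stepSize) {}

    // Replaces the base class behaviour.
    virtual void increment() { this->count += this->step; }

private:
    int step;
};

int main() {
    StepCounter c(5);
    c.increment();
    c.increment();
    std::cout << c.value() << std::endl;   // prints 10
    return 0;
}
```

Nothing deep, just the three ingredients in one place.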

You don't have to go down that deep to observe digital illiteracy btw. I don't want to know how many people still click on links in emails with questionable origin. At some point everybody becomes illiterate, but maybe the threshold level is getting lower these days.

"When I was a boy..."

Did I say “imprecise”? Where did you get “imprecise” from? Are you under the impression that it has been mentioned prior to your use of it in this conversation? Why would you interpret anything said so far as equating with “imprecise”?

Well, maybe I got it wrong.

The opposite of what you've declared 'bad' sounds to me like dragging and dropping a few very colourful icons onto a 'sketch' and wiring them together for processing. Something like that is counterproductive as well.

Languages on the other hand seem to have been marooned back in the ’60s by a clique of beardy self-flagellating programmers that learned pointlessly badly designed languages the hard way

I still think computing is pretty much at the stage steam engines were at before the safety valve or the centrifugal governor were invented.

Ian, if you're worried about ’60s-era languages, don't use ’60s-era architectures with ’60s-era capabilities.

Well, in point of fact, coding in assembly language gives you a lot of power over what the machine does. :slight_smile: I had been writing COBOL, BASIC, and FORTRAN before I did any serious work in assembly language, and doing that made me a much better programmer in any language.

There are, in fact, reasons for the design decisions in any language. They aren't pointless. You might not agree with the design parameters, but in fact they can be quite pointed (or pointy, or use pointers). The verbose monstrosity known as COBOL was designed, on purpose, to be verbose, so that it would be closer to being self-documenting. FORTRAN -- FORmula TRANslator -- was designed as a scientific language. And back then, compute cycles, storage, and memory were all hideously expensive. Lots of compromises went into programming back then. I wrote a lot of the code that people later bitched about when Y2K came around. But I guess nobody remembers that there was a time when you saved 2 bytes if you could.
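To make the 2-byte point concrete, here's a sketch in C++ (the record layout and names are purely hypothetical, not anybody's real code) of how storing the year as two characters goes wrong at the century rollover:

```cpp
#include <cstdio>

// A hypothetical 1970s-style record: the year is stored as two
// characters ("85" meaning 1985) to save two bytes per record.
struct Record {
    char yy[2];
};

// Naive age calculation that silently assumes every year is 19xx.
int ageIn(int currentYY, const Record& r) {
    int birthYY = (r.yy[0] - '0') * 10 + (r.yy[1] - '0');
    return currentYY - birthYY;   // fine right up until the century rolls over
}

int main() {
    Record born1985 = { { '8', '5' } };
    std::printf("Age in 1999: %d\n", ageIn(99, born1985));  // 14, as expected
    std::printf("Age in 2000: %d\n", ageIn(0,  born1985));  // -85: the Y2K bug
    return 0;
}
```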

And when it comes to writing code, well, you are still going to need statements for addition, subtraction, looping, etc., and it's difficult to envision how to state those things in ways that don't sound like -- well, what they are.

There are, however, many, many modern languages from which to choose. I couldn't begin to list them all. Perl, for example, is a far cry from FORTRAN. Some people hate it. But I wouldn't describe Larry Wall as someone who's marooned in the sixties. (He doesn't have a beard either.)

Referring back to the original article, digital illiteracy is a problem. I hear this a lot from my friends who are still working in the IT industry. Lots of younger coders are coming out of college without understanding what real-world programming is about, and haven't learned the sort of fundamental logical skills that come from writing in languages such as C or FORTRAN. Pulling up an IDE and stringing together a bunch of pasted-in Java classes isn't programming. I wonder how many recent Comp-Sci graduates even know who Donald Knuth is.

BTW, I do have a beard, but I don't whip myself these days. :slight_smile:

ETA: Oh, and Linux Rules! What better platform to use, than one which will let you delve into any aspect you want -- even to the point of kernel programming. And where all the code is there for you to read. If you want to really learn how to write good code, you should probably be reading good code. Try doing that with your closed source Windoze boxen.

What I'm wondering is why this "Raspberry Pi" device?

Sure, it seems like a nice system at a very low price - but in the end, it's a small computer running Ubuntu. Any "game development" teaching is likely to be done with PyGame (and SDL, PyOpenGL, etc.). So why not just use a bootable USB stick with Ubuntu on it (cheaper than the Raspberry Pi) - and then give the Raspberry Pi to those who don't own or can't afford a full-sized PC? That would seem to be the best use of all resources.

Ultimately, though, such a machine, while considered "small" by today's standards, is still a very far cry from the "small" 8-bit machines akin to the BBC Micro and others (in my case, it was a TRS-80 Color Computer). There's nothing the Raspberry Pi does that couldn't be done with a full-sized machine, and regardless of which is used, children still wouldn't have any clue as to what is -really- going on under the hood.

If they were smart, they'd give each kid a Nootropic Design Hackvision kit and teach them how to put it together, what the parts are for, and how/why they work; use it to teach how to create games and other small applications, as well as how to interface other or custom controllers; and show how to run motors for vibration feedback, etc.

Slightly more expensive, sure - but waaaaay more accessible to kids (and adults), and more understandable, since it is still small, 8-bit, simple monochrome graphics, etc.

One other thing I want to note, regarding the article: The article bemoans the idea that kids are becoming "digitally illiterate", but I would argue that many adults - including many in computer science - are just as illiterate! At least here in the States, it seems many (not all - there are still good ones around if you look) supposed "computer science" courses are nothing but glorified "learn to program in Java" (or some other language) courses. There seems to be little to no emphasis placed on algorithms and other "base/core/foundational" curricula of computer science. Some of the graduates of these schools can barely code a simple calendar, let alone tell you what a Turing machine is, why NOR/NAND gates are important, etc. (I could really go on and on - suffice it to say, many of these foundational concepts in computer science and the history of computation are not taught, or are barely glossed over).
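As one toy example of what I mean by foundational (my own little sketch, not taken from any particular course): NAND is functionally complete, so every other basic gate can be built from it alone.

```cpp
#include <iostream>

// NAND alone is enough: NOT, AND and OR below are built from nothing but NAND.
bool nandGate(bool a, bool b) { return !(a && b); }

bool notGate(bool a)          { return nandGate(a, a); }
bool andGate(bool a, bool b)  { return notGate(nandGate(a, b)); }
bool orGate(bool a, bool b)   { return nandGate(notGate(a), notGate(b)); }

int main() {
    // Print the truth tables to check the constructions.
    for (int a = 0; a <= 1; ++a)
        for (int b = 0; b <= 1; ++b)
            std::cout << a << " " << b
                      << "  AND=" << andGate(a, b)
                      << "  OR="  << orGate(a, b)
                      << "  NOT(a)=" << notGate(a) << "\n";
    return 0;
}
```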

You ask them about Rule 110 and they stare at you blankly...

You ask them about Rule 110 and they stare at you blankly...

Stares back at you blankly. Is it a US thing?

Rule 110 is the one that comes right after Rule 101

Allow me to put aside for the moment the notion of language syntaxes with appalling usability, which those heroes who have somehow achieved fluency in them will defend to the point of idiocy, forever closed to the possibility of improvement. This clearly isn’t the venue to suggest such heresy; I’m talking to a brick wall in that respect. However, I also have a different point, and it is this:

Personally, I think that a significant portion of the problem is that there is no obvious visible connection that anyone can at first see between a problem and its computer-implemented solution. It’s as simple as that. Almost no program I’ve ever seen and understood goes about things in the way I would if it were me pretending to be a computer. Almost all actual working solutions are not only mysteriously arcane, but seem to be the product of an alien mentality. You can look at a lot of programs, and if you are told what each is supposed to achieve and shown how it actually does it, you’d think that nobody in a month of Sundays would ever arrive at such a convoluted and unintuitive way of getting there. It’s all just so wrong! This is why people can’t learn it.

You can teach people the funny words (or keywords), teach people the stupid syntax, teach people to not do what they’ve always done (for example, “=” doesn’t mean that any more) and teach people about making sure the excessive amounts of punctuation match up, etc. But none of this actually teaches people how to make a program achieve what is required. None of this actually teaches people how to arrive at a solution. None of it actually teaches people how to turn a requirement into a program.

You can know the syntax all you like, but that won’t make a program happen if you can’t see the structural link between what you want to have happen and what alternative alien structures you’re going to transform that comprehension into. What is being taught is how to recognise the syntax to the level that allows you to cut and paste existing programs (for that is where the magic is contained) enough to botch together your own adapted implementation. And that’s as far as it goes.

Ian,
Tell me.
Can you read Cyrillic?
Pitman shorthand?
A knitting pattern?

I feel pretty much the way you feel about computer languages when I look at sheet music.
(I was never taught how to read music)

(for example, “=” doesn’t mean that any more)

That's an interesting one.
To me = means equals, so, if I write ax² + bx + c = 0, I know what that means.
But then some joker in a computer lab says "x = x + 1".
He's clearly into substance abuse.

But then someone else writes "x := x + 1", I can sort of screw up my eyes and squint a little and imagine that the := is actually a left-pointing arrow, so it is clear that he isn't saying that x is the same as x plus one, he's saying that x is assigned the result of the computation of x plus one.

That's what I see when I read a C program, even though I know the colon isn't there.

It's an associative thing, even though part of the association isn't there, like when I see the number "1664", I don't see "one thousand six hundred and sixty four", I see "seize cent soixante quatre" and I see (in my mind) a nice cool glass of beer.
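Put another way, here's a two-line illustration in C++ (nothing more than that): '==' is the mathematician's equals, '=' is the assignment the joker in the lab meant.

```cpp
#include <iostream>

int main() {
    int x = 5;

    x = x + 1;              // assignment: read as "x becomes x + 1"; x now holds 6
    bool same = (x == 6);   // comparison: this one really is "equals"; true here

    std::cout << x << " " << std::boolalpha << same << std::endl;   // prints "6 true"
    return 0;
}
```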

I feel pretty much the way you feel about computer languages when I look at sheet music.
(I was never taught how to read music)

It is completely off the original topic, but sheet music is actually fairly simple to read. It is based around five lines, and the position of a marking on or between those lines dictates the note:

---------------------------------------------------- E ------
D
-------------------------------------- C --------------------
B
------------------------ A ----------------------------------
G
----------- F -----------------------------------------------
E
D --------------------------------------------------------

The musical alphabet goes A, B, C, D, E, F, G, and one set of these is called an octave.
Now for the markings on it. There are semibreves - they are four beats long, minims - two beats, crotchets - one beat, quavers - 1/2 beat, and semiquavers - 1/4 beat. (There are more note values, but these are the main ones.)
And that is the basics of reading sheet music! The five lines are called the staff, and you can get notes above and below it, but they follow the same pattern as the others. For the note symbols, check out Wikipedia - they have nice diagrams.

Onions.

Well, getting back to the original topic - I also agree that this Raspberry Pi computer is going to do nothing to help the real problem of kids not knowing how it works / basic computer hardware theory.

Needless to say I'd like a few if they're going to be that cheap!

@Onions: Interesting, but if there are eight notes to the octave, why five lines?
Or should I be reading between the lines?

Grumpy_Mike:

You ask them about Rule 110 and they stare at you blankly...

Stares back at you blankly. Is it a US thing?

Rule 110 is from the domain of cellular automata. I would say it's debatable whether that's an essential part of digital (or computational) literacy. If you're getting into gaming or other world-simulation work, then yes.
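For anyone still staring blankly, here's a minimal sketch (the cell width, step count and seed position are arbitrary choices of mine): the rule number 110 is just the binary pattern 01101110 that maps each three-cell neighbourhood to the next state of the middle cell.

```cpp
#include <iostream>
#include <vector>

// Elementary cellular automaton, Rule 110.
// Each new cell depends only on its left neighbour, itself and its right
// neighbour; the 8 possible neighbourhoods index into the bits of 110.
int main() {
    const int width = 64, steps = 24, rule = 110;
    std::vector<int> cells(width, 0);
    cells[width - 2] = 1;   // a single live cell near the right edge

    for (int t = 0; t < steps; ++t) {
        for (int i = 0; i < width; ++i)
            std::cout << (cells[i] ? '#' : '.');
        std::cout << '\n';

        std::vector<int> next(width, 0);
        for (int i = 0; i < width; ++i) {
            int left    = cells[(i + width - 1) % width];
            int centre  = cells[i];
            int right   = cells[(i + 1) % width];
            int pattern = (left << 2) | (centre << 1) | right;   // 0..7
            next[i] = (rule >> pattern) & 1;                     // look up the rule bit
        }
        cells = next;
    }
    return 0;
}
```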

AWOL:
@Onions: Interesting, but if there are eight notes to the octave, why five lines?
Or should I be reading between the lines?

Yes, if the note symbol is on the line, it counts as one note. If it is between the lines, it is another note. I guess they made it like that to save space when you get a really long piece of music.

Onions.