an end to tinkering? are iPhones the problem?

Thanks to @aquigley for tweeting about the silicon.com article “Why the iPhone could be bad news for computer science”.  The article quotes Robert Harle from the Computer Laboratory at Cambridge, worrying that the iPhone and other closed platforms are eroding the ability to ‘tinker’ with computers, and so destroying the will amongst the young to understand the underlying technology.

I too have worried about the demise of interest not just in computers, but in science and technology in general.  Also, the way Apple exercise almost draconian control over the platform is well documented (they even rejected an eBook application for fear it could be used to read the Kama Sutra!).

However, is the problem the closedness of the platform?  On the iPhone and other smartphones, it is the apps that catch the imagination, and these are ‘open’ in the sense that it is possible to program your own.  Sure, Apple charge for the privilege (why? the income surely can’t be significant!), but it is free for education.  So what matters, app development, is open … but boy is it hard to get started on the iPhone and many other platforms.

It is not the coding itself, but the hoops you need to go through to get anything running, with multiple levels of ritual incantation.  First you need to create a Certificate Signing Request to get a Development certificate and a Provisioning profile based on your Device ID … sorry, did I lose you?  Surely not, you haven’t even written a line of code yet; for that you really need to understand the nib file … oops, I’ve lost the web page where I read how to do that, wait while I search the Apple Developer site …

Whatever happened to:

10 print "hello world"

This is not just the iPhone: try building your first Facebook app … or, if you are into open standards, X Windows!
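For contrast, here is a minimal sketch of the classic Xlib ‘hello world’ in C (the bare bones, with error handling pared right down): a server connection, a window, and an event loop, all before a single word appears on screen.

#include <X11/Xlib.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void)
{
    /* connect to the X server */
    Display *dpy = XOpenDisplay(NULL);
    if (dpy == NULL) {
        fprintf(stderr, "cannot open display\n");
        exit(1);
    }

    /* create a small window on the default screen */
    int screen = DefaultScreen(dpy);
    Window win = XCreateSimpleWindow(dpy, RootWindow(dpy, screen),
                                     10, 10, 200, 100, 1,
                                     BlackPixel(dpy, screen),
                                     WhitePixel(dpy, screen));

    /* ask for redraw and key events, then make the window visible */
    XSelectInput(dpy, win, ExposureMask | KeyPressMask);
    XMapWindow(dpy, win);

    /* the obligatory event loop: redraw the text on every Expose */
    const char *msg = "hello world";
    XEvent ev;
    for (;;) {
        XNextEvent(dpy, &ev);
        if (ev.type == Expose)
            XDrawString(dpy, win, DefaultGC(dpy, screen),
                        20, 50, msg, (int)strlen(msg));
        if (ev.type == KeyPress)
            break;
    }

    XCloseDisplay(dpy);
    return 0;
}

Compile with cc hello.c -lX11 (assuming the X11 development headers are installed) … hardly 10 print.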

Nigel Davies said his 7-year-old is just starting to code using Scratch.  I recall Harold Thimbleby’s son, now an award-winning Mac developer, similarly starting out using HyperCard.

If we would like a generation of children enthused by Facebook and the iPhone to become the next generation of computer scientists, then we need to give them tools for getting started that are as painless and fun as these were.

the plague of bugs

Like some Biblical locust swarm, every attempt to do anything is thwarted by the dead weight of innumerable bugs!  This time I was trying … and failing … to upload a Word file into Google Docs.  I uploaded the .docx file and was told the file was unreadable; I tried saving it as .doc, and when that failed, created an RTF file.  Amazingly, from a 1 Meg Word file the RTF was 66 Meg, but very, very slowly Google Docs did upload the file, and when it was eventually all uploaded …

To be fair, the same document imports pretty badly into Pages (all the headings disappear).  I think this is because it was originally a 2003 Word file and gets corrupted when the newer Word reads it.

Now, I have griped before about backward-compatibility issues in Word, and in general about the lack of robustness in many leading products.  To add to my woes, for the last month or so (I guess after a software update) Word has decided not to show its formatting menus on an opened document unless I first hide them, then show them, and then maximise the window.  Mostly these things are merely annoying, sometimes they really block work, and they always waste time and destroy the flow of work.

However, rather than grousing once again (well, I already have a bit), I am trying to make sense of this.  For some time it has been apparent that software is fundamentally breaking down: with every new version there is minimal new useful functionality, but more bugs.  This may simply be an issue of scale, of the training of programmers, or of the nature of development processes.  Indeed, in the talk I gave a bit over a year ago to PPIG, “as we may code”, I noted that coding in the 21st century seems to be radically different: more about finding tricks and community know-how, and less about problem solving.

Whatever the reason, I don’t think the Biblical plague of bugs is simply due to laziness or indifference on the part of large vendors such as Microsoft and Adobe, but is symptomatic of a deeper crisis in software development, certainly where there is a significant user interface.

Maybe this is simply an inevitable consequence of scale, but more optimistically I wonder if there are new ways of coding, new paradigms or new architectural models.  Can the 2010s be the decade when software is reborn?