home education – let parents alone!

It is now some years since our two daughters finished their home education, and we had few problems.  However, we know that some home-educating families in other parts of the country had great problems with their LEAs (local education authorities), many of whom did not understand the laws on compulsory education and often thought it was impossible to educate without a timetable!

We chose to home educate based partly on our own experiences of school and partly on meeting the children of other home-educating families and being amazed at their maturity and balance compared to other children of their age.  While we made an explicit decision, others are forced into home education, sometimes through learning difficulties or dyslexia, sometimes through school phobia.

One woman I knew eventually decided to home educate her son when he was 14.  At 10 he became school phobic because of a teacher who was notorious for making the children in his class unhappy; for four years she cooperated with the local authority as they tried to get him back into school, including his being sent away for short periods of residential care.  It was only when it was clear that he was going to reach 16 with no GCSEs and no future that she reluctantly took him out of the school system, and he eventually passed several exams studying at home with her help.

My wife and I were fortunate in our dealings with authorities as we were obviously well educated, could write fluently and persuasively, and knew the law and our own rights inside out (and were helped enormously by the support group Education Otherwise).  However, not all home educating parents have our advantages, and the difficulties and costs of home education are exacerbated by sometimes intimidating demands from education welfare officers or LEAs.

My impression was that, during the period of our daughters’ education, things improved and LEAs came to understand home education better.  However, I recently heard (via a petition on the Downing Street web site) that, I guess as part of the interminable re-hashing of all sectors of education, things are being made more difficult again by repeated reviews of the legal status of home education.

There are numerous examples of public figures from artists to US presidents1 who were home educated, and all the home-educated children I have known, while facing all the pressures and problems of any child growing up, are in their various ways successfully following their chosen paths.  When so many aspects of our education system are under threat, I wonder why on earth government feels the need to meddle with things that have worked, and continue to work, well.

The petition:

We the undersigned petition the Prime Minister to remind his government that parents must remain responsible in law for ensuring the welfare and education of their children and that the state should not seek to appropriate these responsibilities.
http://petitions.number10.gov.uk/Homeedreview/

  1. another support group, home-education.org.uk, has a list of famous home-educated people[back]

persistent URLs … pleeeease

I was just clicking through a link from my own 2000 publication list to an ACM paper1, and the link is broken!  So what’s new? The web is full of broken links … but I hate to find one on my own site.  The URL appears to be a semantic one (not one of those CMS “?nodeid=3179” web pages):

http://www.acm.org/pubs/citations/journals/tochi/2000-7-3/p285-dix/

At the time I used the link it was valid; however, the ACM have clearly changed their structure, as this kind of material is now all in the ACM Digital Library, but they did not leave permanent redirects in place.  This would be forgivable if the URL were for a transient news item, but TOCHI is intended to be an archival publication … and yet the URL is not treated as persistent!  If the ACM, probably the largest professional computing organisation in the world, cannot get this right, what hope is there for the rest of us?

I will fix the link, and nowadays I tend to use ACM’s DOI link as this is likely to be more persistent; however, I can do this only because it is my own site.

So, if you are updating a site structure yourself … please, Please, PLeeeeEASE make sure you keep all those old links alive2!

  1. BTW the paper is:

    Dix, A., Rodden, T., Davies, N., Trevor, J., Friday, A., and Palfreyman, K. 2000. Exploiting space and location as a design framework for interactive mobile systems. ACM Trans. Comput.-Hum. Interact. 7, 3 (Sep. 2000), 285-321. DOI= http://doi.acm.org/10.1145/355324.355325

    and it is all about physical and virtual location … hmmm[back]

  2. For the ACM or other large sites this would be done using some data-driven approach, but if you are simply restructuring your own site and you are using an Apache web server, just add a .htaccess file to your web root with Redirect directives in it mapping old URLs to new ones. For example, for the paper on the ACM site:

    Redirect /pubs/citations/journals/tochi/2000-7-3/p285-dix/ http://doi.acm.org/10.1145/355324.355325

    [back]

just hit search

For years I have heard anecdotal stories of how users are increasingly unaware of the URL itself (indeed the term ‘web address’ is often clearer).  I recall a conversation at a university meeting (non-computing) where it soon became obvious that the term ‘browser’ was also not one they were familiar with, even though they of course used one daily.  I guess, like the mechanics of the car engine, the mechanics of the web are invisible.

I came across the Google Zeitgeist 2008 page that analyses the popular and the rising search terms of 2008.  The rising ones reflect things in the media: “sarah palin” way up there above “obama” in the global stats … if only Google searches were votes!  However, the ‘most popular’ searches reveal longer-term habits.  For the UK the 10 most popular searches are:

  1. facebook
  2. bbc
  3. youtube
  4. ebay
  5. games
  6. news
  7. hotmail
  8. bebo
  9. yahoo
  10. jobs

Some of these terms, ‘games’, ‘news’ and ‘jobs’ (no Steve, not you), are generic categories … which suggests that people approach these from the search box, not a portal.  However, of this top 10, seven are simply the domain names of popular sites.  Instead of typing these into the address bar (which certainly on Firefox autocompletes if I type any I’ve visited before), many users just Google them (and I’m sure the same is true for Live Search and others).

I was told some years ago that AOL browsers swapped the relative sizes (and locations, I think) of the built-in search box and address bar on the assumption that their users rarely typed in URLs (although I knew of AOL users who accidentally typed URLs into the search box).  I also recall the company that used to sell net keywords, which were used by Netscape (and possibly others) if you entered terms rather than a URL into the address bar.

… of course if I try that now … Firefox redirects me through Google’s “I’m Feeling Lucky” … of course

Incidentally, I came to this as I was tracing back the source of the (now shown to be incorrect) Sunday Times news story that said two Google searches use the same electricity as boiling an electric kettle.  This was challenged in a TechCrunch blog, refuted by Google, and effectively (but not explicitly) retracted in a subsequent Times Online item.  The source turns out to be a junior Harvard physicist, Alex Wissner-Gross, whose own source was a blog by Rolf Kersten, one of the Sun Green Team (Sun the computer manufacturer, not the Sun newspaper!), so actually not an unreasonable basis.

In fact Rolf Kersten’s estimate, which was prepared for a talk in 2007, seemed to be based on sensible calculations, although he has recently posted a blog saying the figure was out by a factor of 35 … yes, it actually takes 70 Google searches to boil that kettle.  Looking deeper, the cause of the discrepancy appears to be the figure he used for the number of Google searches per day.  He took 2005 data about the size of the Google server farm and used a figure of 40 million searches per day.  Although Google did not publish their full workings in their response, it is clearly this figure of 40 million searches that was way too low for 2005, as a Feb 2001 Google press release had already quoted 60 million searches per day in 2000.  Indeed, a moment’s reflection makes clear that 40 million searches per day (fewer than 500 per second) would hardly have justified a major server farm; the true figure is clearly in the billions.  However, it is surprisingly difficult to find, and if you Google “google searches per day” you simply find lots of people asking the same question.  In fact, it was through looking for further Google press releases to find a more up-to-date figure that I got to the Zeitgeist page!
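Kersten’s correction is easy to check with a couple of lines of arithmetic: the energy of the server farm is roughly fixed, so the energy per search scales inversely with the number of searches per day, and a factor-of-35 undercount in daily searches means 35 times as many searches per kettle boil.  A quick sketch, using the figures quoted above:

```python
# Original 2007 estimate: 2 Google searches ≈ 1 kettle boil,
# based on an assumed 40 million searches per day.
assumed_searches_per_day = 40_000_000
original_searches_per_kettle = 2

# Kersten later said the search volume was out by a factor of 35.
correction_factor = 35

# With the server farm's energy roughly fixed, 35x the searches means
# each search uses 1/35th the energy, so 35x as many boil the kettle.
implied_searches_per_day = assumed_searches_per_day * correction_factor
corrected_searches_per_kettle = original_searches_per_kettle * correction_factor

print(implied_searches_per_day)       # 1400000000 - 1.4 billion/day
print(corrected_searches_per_kettle)  # 70 searches per kettle boil
```

… so the implied daily volume lands at 1.4 billion searches, at least in the right ballpark of “billions”.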

Eamonn Fitzgerald’s Rainy Day blog nicely lays out the timeline of this story and sees it as a triumph of the power of media consumers to challenge the authority of the press, linked to what Jay Rosen refers to as ‘audience atomization’.  Fitzgerald also sees the paradox that the story itself was sourced from the somewhat broken sources on the internet; in the past the press would perhaps have used more authoritative sources … and, as I noted a couple of years ago at a Memories for Life panel at the British Library, the move from BBC to YouTube could be read as mass democratisation … or simply signal the end of history.

There is another lesson though, one that I picked up in a blog “keeping track of history” not long after the Memories for Life meeting: just how hard it is to find pretty straightforward information on the web.  Then it was Tony Blair’s statement about the execution of Saddam Hussein; this time, the number of Google searches per day.  Neither is secret, proprietary or obscure, but both were difficult to track down.

… but we still trust that single hit of a search button

making life easier – quick filing in visible folders

It is one of those things that has bugged me for years … and if it were right I would probably not even notice it was there – such is the nature of good design – but … when I am saving a file from an application and I already have a folder window open, why is it not easier to select that open folder as the destination?

A scenario: I have just been writing a reference for a student and have a folder for references open on my desktop. I select “Save As …” from the Word menu and get a file selection dialogue, but I have to navigate through my hard disk to find the folder even though I can see it right in front of me (and I have over 11,000 folders, so it does get annoying).

The solution is easy: some sort of virtual folder at the top level of the file tree labelled “Open Folders …” that contains a list of the currently open folder windows in the Finder.  Indeed, for years I instinctively clicked on the ‘Desktop’ folder expecting it to contain the open windows, but of course this just refers to the various aliases and files permanently on the desktop background, not the open windows I can see in front of me.

In fact, as Mac OS X is built on top of UNIX, there is an easy, very UNIX-ish fix (or maybe hack): the Finder could simply maintain an actual folder (probably on the desktop) called “Finder Folders” and add aliases to folders as you navigate.  Although less in the spirit of Windows, this would certainly be possible there too, and of course on any of the Linux-based systems.  … so OS developers out there, “fix it”, it is easy.
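For what it’s worth, here is a sketch of that UNIX-ish fix in a few lines of Python.  Given the list of currently open folder windows (which on a Mac one might obtain by asking the Finder via AppleScript; here it is simply passed in), it keeps a “Finder Folders” directory of symlinks in sync.  The function name and folder layout are, of course, just illustrative:

```python
import os

def sync_finder_folders(open_folders, target="Finder Folders"):
    """Maintain a directory of symlinks, one per open folder window."""
    os.makedirs(target, exist_ok=True)
    # Remove stale links for windows that have since been closed.
    for name in os.listdir(target):
        path = os.path.join(target, name)
        if os.path.islink(path) and os.readlink(path) not in open_folders:
            os.remove(path)
    # Add a link for each currently open folder window.
    for folder in open_folders:
        link = os.path.join(target, os.path.basename(folder))
        if not os.path.lexists(link):
            os.symlink(folder, link)
```

If the Finder (or a background script) called something like this each time a window opened or closed, the ordinary file selection dialogue would automatically show the open windows under “Finder Folders”, with no change to the dialogue itself.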

So why is it that this is a persistent and annoying problem and has an easy fix, and yet is still there in every system I have used after 30 years of windowing systems?

First, although it is annoying and persistent, it does not stop you getting things done; it is about efficiency, not a ‘bug’ … and system designers love to say, “but it can do X”, and then send flying fingers over the keyboard to show you just how.  So it gets overshadowed by bigger issues and never appears in bug lists – and even though it has annoyed me for years, no, I have never sent a bug report to Apple either.

Second, it is only a problem when you have sufficient files.  This means it is unlikely to be encountered during normal user testing.  There is a class of problems like this, along with ‘expert slips’1, that require very long-term use before they become apparent; rigorous user testing is not sufficient to produce usable systems. To be fair, many people have a relatively small number of files and folders (often just one enormous “My Documents” folder!), but at a time when PCs ship with hundreds of gigabytes of disk it does seem slightly odd that so much software fails either in terms of user interface (as in this case) or in terms of functionality (Spotlight is seriously challenged by my disk) when you actually use the space!

Finally, and I think this is the real reason, there is the implementation architecture.  For all sorts of good software engineering reasons, the functional separation between applications is very strong.  Typically the only way they ‘talk’ is through cut-and-paste or drag-and-drop, with occasional scripting for real experts. In most windowing environments the ‘application’ that lets you navigate files (Finder on the Mac, File Explorer in Windows) is just another application like all the rest.  From a system point of view, the file selection dialogue is part of the lower-level toolkit and has no link to the particular application called ‘Finder’.  However, to me as a user, the Finder is special; it appears to me (and I am sure to most users) as ‘the computer’ and certainly part of the ‘desktop’.  Implementation architecture has a major interface effect.

But even if the Finder is ‘just another application’, the same argument holds for all applications.  As a user I see them all, and if I have selected a font in one application why is it not easier to select the same font in another?  In the semantic web world there is an increasing move towards open data / linked data / the web of data2, all about moving data out of application silos.  However, this usually refers to persistent data, more like the file system of the PC … which actually is shared, at least physically, between applications; what is needed in addition is for some of the ephemeral state of interaction to be shared on a moment-to-moment basis.
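To make the idea concrete, here is a minimal sketch of what such sharing of ephemeral state might look like: a tiny publish–subscribe ‘noticeboard’ that applications could use to share moment-to-moment state such as the currently selected font.  The API is entirely hypothetical:

```python
class Noticeboard:
    """Hypothetical shared store for ephemeral interaction state."""

    def __init__(self):
        self._state = {}        # current value per key
        self._subscribers = {}  # callbacks interested in each key

    def publish(self, key, value):
        """Record a new value and notify any interested applications."""
        self._state[key] = value
        for callback in self._subscribers.get(key, []):
            callback(value)

    def subscribe(self, key, callback):
        """Ask to be told whenever the value for this key changes."""
        self._subscribers.setdefault(key, []).append(callback)

    def current(self, key, default=None):
        """Read the current value without subscribing."""
        return self._state.get(key, default)

# One application publishes the font the user just chose ...
board = Noticeboard()
board.publish("selected-font", "Gill Sans 12pt")

# ... and another can simply read it, or react to future changes.
print(board.current("selected-font"))  # Gill Sans 12pt
```

Nothing here is clever; the point is that the noticeboard sits outside any one application’s silo, so the state the user can see is also state other applications can see.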

Maybe this will emerge anyway with the increasing number of micro-applications such as widgets … although, if anything, they often sit in silos as much as larger applications do, just smaller silos.  In fact, I think the opposite is true: micro-applications and desktop mash-ups require us to understand better, and develop, just these ways of allowing applications to ‘open up’, so that they can see what the user sees.

  1. see “Causing Trouble with Buttons” for how Steve Brewster and I once forced infrequent expert slips to happen often enough to be user testable[back]
  2. For example, the Web of Data Practitioners Days I blogged about a couple of months back, and the core vision of the Talis Platform, on whose advisory board I sit.[back]

Backwards compatibility on the web

I just noticed the following excerpt in the web page describing a rich-text editing component:

Supported Browsers (Confirmed)
… list …

Note: This list is now out of date and some new browsers such as Safari 3.0+ and Opera 9.5+ suffer from some issues.
(Free Rich Text Editor – www.freerichtexteditor.com)

In odd moments I have recently been working on bringing vfridge back to life.  Partly this is necessary because the original Java Servlet code was such a pig1, but partly because the dynamic HTML code had ‘died’. To be fair, vfridge was produced in the early days of DHTML, and so one might expect things to change between then and now. However, reading the above web page about a component produced much more recently, I wonder why it is that on the web, and elsewhere, we are so bad at being backward compatible … and I recall my own ‘pain and tears‘ struggling with broken backward compatibility in Office 2008.

I’d started looking at current rich text editors after seeing Paul James’ “Small, standards compliant, Javascript WYSIWYG HTML control“.  Unlike many of the controls that seem to produce MS-like output with <font> tags littered randomly around, Paul’s control emphasises standards compliance in HTML, and uses the emerging de facto designMode2 support in browsers.

This seems good, but one wonders how long these standards will survive, especially the de facto one, given past history; will Paul James’ page have a similar notice in a year or two?

The W3C approach … and a common institutional one … is to define unique standards that are (intended to be) universal and unchanging, so that if we all use them everything will still work in 10,000 years time.  This is a grand vision, but only works if the standards are sufficiently:

  1. expressive so that everything you want to do now can be done (e.g. not deprecating the use of tables for layout in the absence of design grids, leading to many horrible CSS ‘hacks’)
  2. omnipotent so that everyone (MS, Apple) does what they are told
  3. simple so that everyone implements it right
  4. prescient so that all future needs are anticipated before multiple differing de facto ‘standards’ emerge

The last of these is the reason why vfridge’s DHTML died: we wanted rich client-side interaction when the stable standards were not much beyond simple page transactions; and this looks like the reason many rich-text editors are struggling now.

A completely different approach (requiring a degree of humility from standards bodies) would be to accept that standards always fall behind practice, and to design this into the standards themselves.  There need to be simple (and so consistently supported) ways of specifying:

  • which versions of which browsers a page was designed to support – so that browsers can be backward or cross-browser compliant
  • alternative content for different browsers and versions … and no, the DTD does not do this, as different versions of browsers have different interpretations of, and bugs in, different HTML variants.  W3C groups looking at cross-device mark-up already have work in this area … although it may fail the simplicity test.

Perhaps more problematically, browsers need to commit to being backward compatible wherever possible … I am thinking especially of the way IE fixed its own broken CSS implementation, but did so in a way that broke all the standard hacks that had been developed to work around the old bugs!  Currently this would mean fossilising old design choices and even old bugs, but if web-page meta information specified the intended browser version, the browser could selectively operate on older pages in ways compatible with the older browsers whilst offering improved behaviour for newer pages.
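To sketch how that might work, suppose pages carried meta information naming the browser version they were designed for; a browser could then decide between bug-compatible and improved behaviour per page.  The meta name and the crude version parsing below are invented purely for illustration:

```python
import re

def designed_for(page_html):
    """Extract a hypothetical 'designed-for' declaration from a page."""
    m = re.search(r'<meta name="designed-for" content="([^"]+)"', page_html)
    return m.group(1) if m else None

def choose_rendering_mode(page_html, browser, version):
    """Old pages get old (bug-compatible) behaviour; newer pages get
    the improved behaviour; undeclared pages get the cautious default."""
    target = designed_for(page_html)        # e.g. "IE 6"
    if target is None:
        return "quirks"                     # no declaration: play safe
    name, ver = target.rsplit(" ", 1)
    if name == browser and version > float(ver):
        return "compatibility"              # emulate the older browser
    return "standards"

page = '<html><head><meta name="designed-for" content="IE 6"></head></html>'
print(choose_rendering_mode(page, "IE", 7.0))  # compatibility
```

This is essentially what IE’s later “compatibility view” did for its own past versions; the wish here is for something simple and standardised across browsers rather than ad hoc per vendor.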

  1. The vfridge Java Servlets used to run fine, but over time got worse and worse; as machines got faster and JVM versions improved with supposedly faster byte-code compilers, strangely the same code got slower and slower until it now only produces results intermittently … another example of backward compatibility failing.[back]
  2. I would give a link to designMode except that I notice everyone else’s links seem to be broken … presumably MSDN URLs are also not backwards compatible 🙁 Best bet is just Google “designMode” [back]

Dublin, Guinness and the future of HCI

I wrote the title of this post on 5th December just after I got back from Dublin.  I had been in Dublin for the SIGCHI Ireland Inaugural Lecture “Human–Computer Interaction in the early 21st century: a stable discipline, a nascent science, and the growth of the long tail” and was going to write a bit about it (including my first flight on and off Tiree) … then thought I’d write a short synopsis of the talk … so parked this post until the synopsis was written.

One month and 8000 words later, the ‘synopsis’ sort of grew … but it is just finished and now on the web as either an HTML version or a PDF. Basically it is a sort of ‘state of the nation’ about the current position and challenges of HCI as a discipline …

And although it now fades a little, I had a great time in Dublin: meeting people, talking research, good company, good food … and yes … the odd pint of Guinness too.

sunrise

One of those mornings when Iona and the Ross of Mull peep over the horizon, islands floating above a sea of light.  Then the sun breaks, itself fluid, a drop of steel from the furnace, burning gold flowing like mercury.

Christmas services

Lots of travelling about to see family and lots of deadlines over Christmas period, but now back on Tiree and it’s New Year’s day, so time for a short breather.

All the travelling about meant we didn’t seem to get to as many carol services and the like before Christmas, so it didn’t really feel as if Christmas Day was near until it was upon us.

Miriam is in the choir at Birmingham Cathedral and was singing at both the midnight service on Christmas Eve and the Christmas morning service, so we all spent Christmas with her.  To be honest, I expected the Cathedral services to be beautiful but lifeless affairs.  I was very wrong.

The midnight service was full-on bells-and-smells with smoking incense wafted over Bible, congregation and anything else in range.  Poor Fiona, who had a cough anyway, struggled to breathe through the rich spicy fumes, and one other girl had to step out for air.  As well as the people, but I’ll come to them later, the turning point for me was when the priest, dressed in a white surplice with beautiful and intricate embroidery, came to the centre of the aisle to read the New Testament lesson, flanked on either side by candle bearers in cassocks … of course all heavily wafted in incense … and she began to read.  Clearly she too found it hard to breathe in the smoke (Miriam told us later that incense was only occasionally used in the services, so she had had little practice reading under adverse conditions), but, as her head moved from side to side reading the lesson, the light caught a sparkle of glitter in her hair and suddenly the solemnity of the occasion, her own humanity and the joy of the celebration all came together.  A sparkle in the hair: a parable for the message of incarnation.

To be honest tired after so much driving, thesis reading, etc., I fell asleep through some of the sermon … and yes, sometimes if you notice me in a talk with my eyes closed it is rapt attention, but sometimes …

Christmas services always bring in the once-a-year visitors as well as the weekly faithful, so they are always difficult for the preacher.  Some preachers primarily address the latter, hoping that the former will absorb the message and the occasion, which is often appropriate in small communities.  The opposite approach is to treat the service as an ‘opportunity for evangelism’ addressed to the newcomer, but this risks treating them as gospel cannon fodder and maybe even neglecting the celebration that is Christmas. This seems particularly difficult in a cathedral, which is both a place of worship and part of civic life; there to serve the city, to welcome people in on their own terms and yet offer them more.

Again, being honest, I expected the preacher to ‘play safe’: bland words and a Christmas greeting. Instead (between my naps) he seemed to get it just right: not ‘preachy’, but welcoming, yet leaving no doubt that we were not gathered just to sing a few nice songs, but to celebrate the birth of a real baby, who became a real man, offering a rich promise and making real demands on our lives.

However, what struck me most was not so much the ‘front activity’ of priests and candles, lectern and Bible, or even bread and wine, but the people in the congregation. So varied: the lady in ‘Sunday best’, everybody’s image of the librarian or old-style school teacher; three men with tightly cropped hair like the Mitchell brothers in EastEnders; a group of young women maybe going to a party later; a girl in a heavy-metal sweatshirt; and a young couple punctuating their devotions with the occasional hug.  I wondered at their stories: some regulars, some who, like us, had come especially for the service, some who had maybe just wandered in.

On the Sunday after Christmas we were all in Kendal and went to the morning service at Sandylands, our regular church while we lived in Kendal.  Being the post-Christmas service, one of the readings was the familiar story of Mary and Joseph taking the new baby Jesus to the temple to make their offering of two doves after the birth.  Simeon, an old man, had been told that he would not die until he had seen the Messiah … he sees the baby Jesus and recognises him.

As this was read, the mathematician in me woke up and I did some quick sums: probably a few hundred thousand people in Palestine at that time, a birth rate for replacement of maybe 3%, so thousands of babies a year brought to the temple, maybe a few dozen babies every day … and out of all these Simeon spots one.  I was reminded of Ursula Le Guin’s book “A Wizard of Earthsea“.  It is set in a world of oceans and islands … not so different from the Hebrides … all interlaced with magic.  The young wizard Sparrowhawk is on a quest, and in a tower in the far north is led into a deep room where a great treasure is stored.  The room is empty, but amongst the flagstones on the floor he spots one, a small insignificant stone, worn by the depths of ages, but so, so old and holding a deep, ancient, dark power.  Like Sparrowhawk, Simeon spots the one of great value amongst the many; does it take special eyes or a special gift to see, or just willingness to look?

Although the old dark powers in “A Wizard of Earthsea” are just a story, the Christmas celebrations are themselves set at midwinter, the time of many old pre-Christian festivals.  The early Christians saw no problem in embracing pagan places and even names (even the English word Easter!) so long as they were consistent with the Christian purpose. Indeed, it is commonly assumed, there being no indication in the Gospels of the time of year when Jesus was born, that the early Christians chose the Roman feast of Sol Invictus, the official state sun god, so as to declare Christ the Son’s place at the heart of formal society; like those in Birmingham today, treading a difficult line between personal devotion and civic religion.

The parties, eating and drinking of Christmas capture some of the spirit of pagan midwinter, celebrating the sun’s return, and are not entirely absent from cathedral mass … see the sparkle in the hair amongst the robes and dress coats of formal civic celebration; but both raw humanity and ordered civilisation give way to something greater.  There is a logic that is not the tit-for-tat logic of base human nature, nor the clinical logic of pure reason (although both are part of us), but instead a logic that takes all the problems and pain of the world and answers them in a tiny wriggling baby.

short story for Christmas

A few years ago I wrote a short children’s story, “Christopher’s Christmas“.  I put a PDF online at the time, but have just put the whole thing online as HTML to read in the browser.

This was partly written because it is too easy to forget how difficult things are for little children, and also as a personal catharsis remembering myself, as a slightly older child, on my first Christmas after my own dad died.

… and no, I didn’t get a 5 year old to draw the pictures … just that I draw like a 5 year old :-/

reading: Computing is a Natural Science

I’ve just been re-reading Peter Denning’s article “Computing is a Natural Science”1.  The basic thesis is that computation as a broad concept goes way beyond digital computers, and that many aspects of science and life have a computational flavour; “Computing is the study of natural and artificial information processes“.  As an article this is in some ways not so controversial, as computational analogies have been used in cognitive science for almost as long as there have been computers (and before that, analogies with steam engines), and readers of popular science magazines can’t have missed the physicists using parallels with information and computation in cosmology.  However, Denning’s paper is to some extent a manifesto, or call to arms: “computing is not just a load of old transistors, but something to be proud of, the roots of understanding the universe”.

Particularly useful is the “principles framework for computing” that Denning has been constructing through a community discussion process.  I mentioned this before when blogging about “what is computing?”  The article lists the top-level categories and gives examples of how each can be found in areas that one would not generally think of as ‘computing’.

Computation (meaning and limits of computation)
Communication (reliable data transmission)
Coordination (cooperation among networked entities)
Recollection (storage and retrieval of information)
Automation (meaning and limits of automation)
Evaluation (performance prediction and capacity planning)
Design (building reliable software systems)

categories from “Great Principles of Computing”

Occasionally the mappings are stretched … I’m not convinced that natural fractals are about hierarchical aggregation … but they do paint a picture of ubiquitous parallels across the key areas of computation.

Denning is presenting a brief manifesto not a treatise, so examples of very different kinds tend to be presented together.  There seem to be three main kinds:

  • ubiquity of computational devices – from iPods to the internet, computation is part of day-to-day life
  • ubiquity of computation as a tool –  from physical simulations to mathematical proofs and eScience
  • ubiquity of computation as an explanatory framework – modelling physical and biological systems as if they were performing a computational function

It is the last, computation as analogy or theoretical lens, that seems most critical to the argument, as the other two are really about computers in the traditional sense, and few would dispute that computers (rather than computation) are everywhere.

Rather like weak AI, one can treat these analogies as simply that, rather as electrical flow in circuits can be compared with water flow in pipes.  So we may feel that computation is a good way to understand a phenomenon, without any intention of saying the phenomenon is fundamentally computational.

However, some things are closer to a ‘strong’ view of computation as natural science.  One example is the “socio-organisational Church–Turing hypothesis”, a term I often use in talks, which has its origins in a 1998 paper “Redefining Organisational Memory“.  The idea is that organisations are, amongst other things, information processing systems, and it is therefore reasonable to expect to see structural similarities between phenomena in organisations and those in other information processing systems such as digital computers or our brains.  Here computation is not simply an analogy or useful model; the similarity exists because there is a deep-rooted relationship: an organisation is not just like a computer, it is actually computational.

Apparently absent are examples of the methods of algorithmics or software engineering being applied in other domains: what has become known as ‘computational thinking‘. This may be because there are two sides to computing:

  • computing (as a natural science) understanding how computers (and similar things) behave – related to issues such as emergence, interaction and communication, limits of computability
  • computing (as a design discipline) making computers (and similar things) behave as we want them to – related to issues such as hierarchical decomposition and separation of concerns

The first can be seen as being about understanding complexity, the second about controlling it.  Denning is principally addressing the former, whereas computational thinking is about the latter.

Denning’s thesis could be summarised as “computation is about more than computers”.  However, that is perhaps obvious given that the early study of computation by Church and Turing predated digital computers; indeed Church was primarily addressing what could be computed by mathematicians, not machines!  Certainly I was left wondering what exactly is ubiquitous: computation, mathematics, cybernetics?

Denning notes how the pursuit of cybernetics as an all-embracing science ran aground as it seemed to claim too much territory (although the practical application of control theory is still alive and well) … and one wonders if the same could happen to Denning’s vision.  Certainly embodied-mind theorists would find more in common with cybernetics than with much of computing, and mathematicians are not going to give up the language of God too easily.

Given my interest in all three subjects (computation, mathematics, cybernetics), it prompts me to look for the distinctions and overlaps, and maybe the way they are all linked (perhaps all just part of mathematics ;-)).  Furthermore, as I am also interested in understanding the nature of computation, I wonder whether looking at how natural things are ‘like’ computation may be not only an application of computational understanding, but also a lens or mirror that helps us see computation itself more clearly.

  1. well when I say re-reading I think the first ‘read’ was more of a skim, great thing about being on sabbatical is I actually DO get to read things 🙂 [back]