spice up boring lists of web links – add favicons using jQuery

Earlier today I was laying out lists of links to web resources, initially as simple links.

However, this looked a little boring, and so I thought it would be good to add each site’s favicon (the little icon shown to its left in a web browser), and have a list like this:

  jQuery home page

  Wikipedia page on favicons

  my academic home page

The pages with the lists were being generated, and the icons could have been inserted using a server-side script, but to simplify the server-side code (for speed and maintainability) I put the fetching of favicons into a small JavaScript function using jQuery.  The page is initially written (or generated) with default images, and the script simply fills in the favicons when the page is loaded.
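
The heart of the script is only a few lines.  Here is a minimal sketch of the idea (not the actual code from the favicon code page; the class names and default image are placeholders of my own):

    // markup assumed:  <img class="favicon" src="default.png"> <a class="favlink" href="...">...</a>
    $(function () {
      $('a.favlink').each(function () {
        // favicons conventionally live at the root of the linked site
        var icon = 'http://' + this.hostname + '/favicon.ico';
        $(this).prev('img.favicon')
               .on('error', function () { this.src = 'default.png'; })  // keep default if no favicon
               .attr('src', icon);
      });
    });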

The list above is made by hand, but look at this example page to see the script in action.

You can use this in your own web pages and applications by simply including a few JavaScript files and adding classes to certain HTML elements.

See the favicon code page for a more detailed explanation of how it works and how to use it in your own pages.

If Kodak had been more like Apple

Finally Kodak has crumbled; technology and the market changed, but Kodak could not keep up. Lots of memories of those bright yellow and black film spools, and memories in photographs piled in boxes beneath the bed.

But just imagine if Kodak had been more like Apple.

I’m wondering about the fallout from the Kodak collapse. I’m not an investor, nor an employee, nor even a supplier, but I have used Kodak products since childhood and I do have 40 years of memories in Kodak’s digital photo cloud. There is talk of Fuji buying up the remains of the photo cloud service, so it may be that it will re-emerge, but for the time being I can no longer stream my photos to friends’ kTV-enabled TV sets when I visit, nor view them online.

Happily, my Kodak kReader has a cache of most of my photos. But how many, I’m not sure: when did I last look at the photos of those childhood holidays, or my wedding? Will they be in my reader? I’ll check my kPhone as well. I’d hate to think I’d lost the snaps of the seaside holiday when my hat blew into the water; I only half remember it, but every time I look at it I remember being told and re-told the story by my dad.

The kReader is only a few months old. I usually try to put off getting a new one as they are so expensive, but even after a couple of years the software updates put a strain on the old machines.  I had to give up when my three-year-old model seemed to take about a minute to show each photo. It was annoying as this wasn’t just the new photos, but ones I recall viewing instantly on my first photo-reader more than 30 years ago (I can still remember the excitement as I unwrapped it one Christmas; I was 14 at the time, but now children seem to get their first readers when they are 4). The last straw was when the software updates would no longer work on the old processor and all my newer photos were appearing in strange colours.

Some years ago, I’d tried using a Fuji-viewer, which was much cheaper than the Kodak one. In principle you could download your photo cloud collection in an industry standard format and then import them into the Fuji cloud. However, this lost all the notes and dates on the photos and kept timing out unless I downloaded them in small batches, then I lost track of where I was. Even my brother-in-law, who is usually good at this sort of thing, couldn’t help.

But now I’m glad I’ve got the newest model of kReader, as it has 8 times the memory of the old one, so hopefully all of my old photos are in its cache. But oh no, just a thought: has it only cached the things I’ve looked at since I got it?  If so I’ll have hardly anything. Please, please let the kReader have downloaded all it could.

Suddenly, I remember the days when I laughed a little when my mum was still using her reels of old Apple film and the glossy prints that would need scanning to share on the net (not that she did use the net, she’d pop them in the post!). “I know it is the future”, she used to say, “but I never really trust things I can’t hold”. Now I just wish I’d listened to her.

Wikipedia blackout and why SOPA whinging gets up my nose

Nobody on the web can be unaware of the Wikipedia blackout, and anyone who hadn’t heard of SOPA or PIPA before will have now.  Few who understand the issues would deny that SOPA and PIPA are misguided and ill-informed; even Apple and other software giants abandoned them, and Obama’s recent statement has effectively scuppered SOPA in its current form.  However, at the risk of apparently annoying everyone, am I the only person who finds some of the anti-SOPA rhetoric at best naive and at times simply arrogant?

Wikipedia Blackout screenshot

The ignorance behind SOPA and a raft of similar legislation and court cases across the world is deeply worrying.  Only recently I posted about the NLA case in the UK, which creates potential copyright issues when linking on the web, reminiscent of the Shetland Times case nearly 15 years ago.

However, that is no excuse for blinkered views on the other side.

I got particularly fed up a few days ago reading an article, “Lockdown: The coming war on general-purpose computing”[1], by copyright activist Cory Doctorow, based on a keynote he gave at the Chaos Computer Congress.  The argument was that attempts to limit the internet destroyed the very essence of the computer as a general-purpose device and were therefore fundamentally wrong.  I know that Sweden has just recognised Kopimism as a religion, but still, an argument that relies on the inviolate nature of computation leaves one wondering.

The article also argued that elected members of Parliament and Congress are by their nature layfolk, and so quite reasonably not expert in every area:

And yet those people who are experts in policy and politics, not technical disciplines, still manage to pass good rules that make sense.

Doctorow has trust in the nature of elected democracy for every area from biochemistry to urban planning, but not information technology, which, he asserts, is in some sense special.

Now even as a computer person I find this hard to swallow, but what would a geneticist, physicist, or even a financier using the Black-Scholes model make of this?

Furthermore, Congress is chastised for finding unemployment more important than copyright, and the UN for giving first regard to health and economics — of course, any reasonable person is expected to understand this is utter foolishness.  From what parallel universe does this kind of thinking emerge?

Of course, Doctorow takes an extreme position, but the Electronic Frontier Foundation’s position statement, which Wikipedia points to, offers no alternative proposals and employs scaremongering arguments more reminiscent of the tabloid press, in particular the claim that:

venture capitalists have said en masse they won’t invest in online startups if PIPA and SOPA pass

This turns out to be based on a Google-sponsored report[2] and refers to “digital content intermediaries (DCIs)”, those offering “search, hosting, and distribution services for digital content”, not startups in general.

When this is the quality of argument being mustered against SOPA and PIPA, is it any wonder that Congress is influenced more by the barons of the entertainment industry?

Obviously some, such as Doctorow and more fundamental anti-copyright activists, would wish to see a completely unregulated net.  Indeed, this is starting to be the case de facto in some areas, where covers are distributed pretty freely on YouTube without apparently leading to a collapse of the music industry, offering new bands much easier ways to make an initial name for themselves.  Maybe in 20 years’ time Hollywood will have withered and we will live off a diet of YouTube videos :-/

I suspect most of those opposing SOPA and PIPA do not share this vision; indeed, Google has been paying 1/2 million per patent in recent acquisitions!

I guess the idealist position sees a world of individual freedom, but it is not clear that is where things are heading.  In many areas online distribution has already resulted in a shift of power from the traditional producers, the different record companies and book publishers (often relatively large companies themselves), to often a single mega-corporation in each sector: Amazon, Apple iTunes.  For the latter this was in no small part driven by the need for the music industry to react to widespread filesharing.  To be honest, however bad the legislation, I would rather trust myself to elected representatives than to unaccountable multinational corporations[3].

If we do not wish to see poor legislation passed we need to offer better alternatives, both in terms of the law of the net and how we reward and fund the creative industries.  Maybe the BBC model is best: high-quality entertainment funded by the public purse and then distributed freely.  However, I don’t see the US Congress nationalising Hollywood in the near future.

Of course copyright and IP is only part of a bigger picture in which the net is challenging traditional notions of national borders and sovereignty.  In the UK we have seen recent cases where Twitter was used to undermine court injunctions.  The injunctions were in place to protect a few celebrities, so were seen as ‘fair game’ anyway, and elicited little public sympathy.  However, the Leveson Inquiry has heard evidence from the editor of the Express defending his paper’s suggestion that the McCanns may have killed their own daughter; we expect and enforce standards in the print media (the Express paid £500,000 after a libel case), so would we expect less if the Express hosted a parallel news website in the Cayman Islands?

Whether it is privacy, malware or child pornography, we do expect, and need to think of, ways to limit the excesses of the web whilst preserving its strengths.  Maybe the solution is more international agreements, hopefully not yet more extra-territorial laws from the US[4].

Could this day without Wikipedia be not just a call to protest, but also an opportunity to envision what a better future might be?

  1. blanked out today, see Google cache[back]
  2. By Booz&Co, which I thought at first was a wind-up, but appears to be a real company![back]
  3. As I write this, I am reminded of the corporation-controlled world of Rollerball and other dystopian SciFi.[back]
  4. How come is there more protest over plans to shut out overseas web sites than there is over unmanned drones performing extra-judicial executions each week?[back]

tread lightly — controlling user experience pollution

When thinking about usability or user experience, it is easy to focus on the application in front of us, but the way it impacts its environment may sometimes be far more critical. However, designing applications that are friendly to their environment (digital and physical) may require deep changes to the low-level operating systems.

I’m writing this post effectively ‘offline’ into a word processor for later upload. I sometimes do this as I find it easier to write without the distractions of editing within a web browser, or because I am physically disconnected from the Internet. However, now I am connected, and indeed I can see I am connected as an FTP file upload is progressing; it is just that anything else network-related is stalled.

The reason that the FTP upload is ‘hogging’ the network is, I believe, due to a quirk in the UNIX scheduling system, which was, paradoxically, originally intended to improve interactivity.

UNIX, which sits underneath Mac OS, is a multiprocessing operating system running many programs at once. Each process has a priority, called its ‘niceness’, which can be set explicitly, but is also tweaked from moment to moment by the operating system. One of the rules for ‘tweaking’ it is that if a process is IO-bound, that is, if it is constantly waiting for input or output, then its niceness is decreased, meaning that it is given higher priority.

The reason for this rule is partly to enhance interactive performance in the old days of command line interfaces; an interactive program would spend lots of time waiting for the user to enter something, and so its priority would increase meaning it would respond quickly as soon as the user entered anything. The other reason is that CPU time was seen as the scarce resource, so that processes that were IO bound were effectively being ‘nicer’ to other processes as they let them get a share of the precious CPU.
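
The explicit side of this is easy to see for yourself; here is a minimal sketch in Node.js, whose os module wraps the underlying UNIX getpriority/setpriority calls (the dynamic ‘tweaking’ described above happens inside the kernel, on top of whatever is set here):

    const os = require('os');

    console.log('niceness now:', os.getPriority());  // 0 by default
    os.setPriority(os.getPriority() + 10);           // be 'nicer': lower our own priority
    console.log('after renice:', os.getPriority());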

The FTP program is simply sitting there shunting data out to the network, so it is almost permanently blocked waiting for the network, as it can read from the disk faster than the network can transmit. This means UNIX regards it as ‘nice’ and ups its priority. As soon as the network clears sufficiently, the FTP program is rescheduled; it puts more into the network queue and reads the next chunk from disk, until the network is again full to capacity. Nothing else gets a chance: no web, no email, not even a network trace utility.

I’ve seen the same before with a database server on one of Fiona’s machines — all my fault. The MySQL manual suggests that you disable indices before large bulk updates (e.g. ingesting a file of data) and then re-enable them once the update is finished, as indexing is more efficient done over lots of data at once than a record at a time. I duly did this and forgot about it, until Fiona noticed something was wrong on the server and web traffic had ground to a near halt. When she opened a console on the server, she found that it seemed quiet, very little CPU load at all, and was puzzled until I realised it was my indexing. Indexing requires a lot of reading and writing of data to and from disk, so MySQL became IO-bound, was given higher priority, and as soon as the disk was free was rescheduled and hit the disk once more. Just as FTP is now hogging the network, MySQL hogged the disk and nothing else could read or write. Of course MySQL’s own performance was fine, as it internally interleaved queries with indexing; it was just everything else on the system that failed.
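
For the record, the bulk-load pattern from the manual looks roughly like this, sketched in JavaScript (Node.js with the ‘mysql’ npm package) to keep all the examples here in one language; the table and connection details are invented:

    const mysql = require('mysql');   // the 'mysql' npm package
    const db = mysql.createConnection({ host: 'localhost', database: 'research' });

    db.query('ALTER TABLE records DISABLE KEYS');  // suspend index maintenance
    // ... ingest the big file of data here, batch by batch ...
    db.query('ALTER TABLE records ENABLE KEYS');   // rebuild the indices in one go
    db.end();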

These are hard scenarios to design for. I have written before (“why software need never hang”) about the way application designers do not think sufficiently about potential delays due to slow networks, or broken connections. However, that was about the applications that are suffering. Here the issue is not that the FTP program is badly designed for its delays, it is still responding very happily, just that it has had a knock-on effect on the rest of the system. It is like cleaning your sink with industrial bleach — you have a clean house within, but pollute the watercourse without.

These kinds of issues are not related solely to network and disk; any kind of resource is limited, and profligacy causes damage in the digital world as much as in the physical environment.

Some years ago I had a Symbian smartphone, but it proved unusable as its battery life rarely exceeded 40 minutes from full charge. I thought I had a duff battery, but later realised it was because I was leaving applications on the phone ‘open’. I went to the address book, looked up a number, and that was that; I then maybe turned the phone off or switched to something else without ‘exiting’ the address book. I was treating the phone like every previous phone I had used, but this one was different: it had a ‘real’ operating system, and opening the address book launched the address book application, which then kept on running — and using power — until it was explicitly closed. A model that is maybe fine for permanently plugged-in computers, but disastrous for a mobile phone.

When the early iPhones came out, iOS was criticised for not multitasking, that is, not having lots of things running in the ‘background’. However, this undoubtedly helped its battery life. Now, with newer versions of iOS, this has changed and there are lots of apps running at once, and I have noticed the battery life reducing. Is that simply the battery wearing out with age, or the effect of all those apps running?

Power is of course a problem not just for smartphones, but for any laptop. I try to close down applications on my Mac when I am working without power, as I know some programs just eat CPU when they are apparently idle (yes, Firefox, it’s you I’m talking about). And from an environmental point of view, lower power consumption when connected would also be good. My hope was that Apple would take the lessons learnt in early iOS to change the nature of their mainstream OS, but sadly they succumbed to the pressure to make iOS a ‘proper’ OS!

Of course the FTP program could try to be friendly, perhaps deliberately throttling its network activity when it is not the selected window. But then the 4-hour upload would take 8 hours; instead of the 20 minutes left at this point, I’d be looking forward to another 4 hours and 20 minutes, and I’d be complaining about that.

The trouble is that there needs to be better communication, more knowledge shared, between application and operating system. I would like FTP to use all the network capacity that it can, except when I am interacting with some other program. Either FTP needs to say to the OS “hey here’s a packet, send it when there’s a gap”[1], or the OS needs some way for applications to determine current network state and make decisions based on that. Sometimes this sort of information is easily available, more often it is either very hard to get at or not available at all.
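
Just to make the design idea concrete, here is a toy sketch of a ‘polite’ uploader that backs off while the user is active; every name in it is hypothetical, and a real implementation would need exactly the input-activity and network-state information that, as noted above, is so hard to get at:

    let lastUserActivity = Date.now();  // a real app would update this from input events

    function userIsBusy() {
      return Date.now() - lastUserActivity < 2000;  // user active in the last 2 seconds?
    }

    async function politeUpload(chunks, sendChunk) {
      for (const chunk of chunks) {
        while (userIsBusy()) {                                   // leave the network alone
          await new Promise(resolve => setTimeout(resolve, 500));
        }
        await sendChunk(chunk);                                  // otherwise use full capacity
      }
    }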

I recall years ago when the internet was still mainly through pay-per-minute dial-up connections. You could set your PC to dial automatically whenever the internet was needed. However, some programs, such as chat, would periodically check with a central server to see if there was activity, and this would cause the PC to dial up the ISP. If you were lucky, the PC also had an auto-disconnect after a period of inactivity; if you were not, the PC would connect at 2am and by the morning you’d find yourself with a phone bill of more than your week’s wages.

When we were designing onCue at aQtive, we wanted to be able to connect to the Internet when it was available, but avoid bankrupting our users. Clearly somewhere in the TCP/IP stack, the layers of code over the network, at some level deep down, it knew whether we were connected. I recall we found a very helpful function in the Windows API called something like “isConnected”[2]. Unfortunately, it worked by attempting to send a network packet, returning true if it succeeded and false if it failed. Of course sending the test packet caused the PC to auto-dial …

And now there is just 1 minute and 53 seconds left on the upload, so time to finish this post before I get on to garbage collection.

  1. This form of “send when you can” would also be useful in cellular networks, for example when syncing photos.[back]
  2. I had a quick peek, and found that Windows CE has a function called InternetGetConnectedState.  I don’t know if this works better now.[back]

New Year and New Job

It is a New Year and I am late with my Christmas crackers again!

If you are expecting the annual virtual cracker from me it is coming … but maybe not before Twelfth Night :-/

The New Year is bringing changes, not least, as many already know, that I am moving my academic role, taking up a part-time post as professor down at Birmingham University.

At Birmingham I will be joining an established and vibrant HCI centre, including long-term colleague and friend Russell Beale.  The group has recently had substantial investment from the University leading to several new appointments, including Andrew Howes (who coincidentally also has past Lancaster connections).

The reasons for the move are partly to join this exciting group and partly to simplify life: Talis is based in Birmingham, so there will be just one place to travel to regularly, and one of my daughters is also there.

Of course this also means I will be leaving many dear colleagues and friends at Lancaster, but I do expect to continue to work with many and am likely to retain a formal or informal role there for some time.

As well as moving institutions I am also further reducing my percentage of academic time — typically I’ll be just one day a week academic.  So, apologies in advance if my email responses become even more sporadic and I turn down (or fail to answer :-() requests for reviews, PhD exams, etc.

Although moving institutions, I will, of course, continue to live up in Tiree (wild and windy, but, at the moment, so is everywhere!), so will still be travelling up and down the country; I’ll wave as I pass!

… and there will be another Tiree Tech Wave in March 🙂

After the Tech Wave is over

The Second Tiree Tech Wave is over.   Yesterday the last participants left by ferry and plane and after a final few hours tidying, the Rural Centre, which the day before had been a tangle of wire and felt, books and papers, cups and biscuit packets, is now as it had been before.  And as I left, the last boxes under my arm, it was strangely silent with only the memory of voices and laughter in my mind.

So is it as if it had never been?  Is there anything left behind?  There are a few sheets of Magic Whiteboard on the walls, which I left so that those visiting the Rural Centre in the coming weeks can see something of what we were doing, and there are used teabags and fish-and-chip boxes in the bin, but few traces.

We trod lightly, like the agriculture of the island, where Corncrake and orchid live alongside sheep and cattle.

Some may have heard me talk about the way design is like a Spaghetti Western. In the beginning of the film Clint Eastwood walks into the town, and at the end walks away.  He does not stay, happily ever after, with a girl on his arm, but leaves almost as if nothing had ever happened.

But while he, like the designer, ultimately leaves, things are not the same.  The Carson brothers, who had the town in fear for years, lie dead in their ranch at the edge of town, the sharp tang of gunfire still in the air and the buzz of flies slowly growing over the elsewise silent bodies.  The crooked major, who had been in the pocket of the Carson brothers, is strapped over a mule heading across the desert towards Mexico, and not a few wooden rails and water butts need to be repaired.  The job of the designer is not to stay, but to leave, yet to leave change: intervention more than invention.

But the deepest changes are not those visible in the bullet-pocked saloon door, but in the people.  The drunk who used to sit all day at the bar, has discovered that he is not just a drunk, but he is a man, and the barmaid, who used to stand behind the bar has discovered that she is not just a barmaid, but she is a woman.

This is true of the artefacts we create and leave behind as designers, but much more so of the events, which come and go through our lives.  It is not so much the material traces they leave in the environment, but the changes in ourselves.

I know that, as the plane and ferry left with those last participants, a little of myself left with them, and I know many, probably all, felt a little of themselves left behind on Tiree.  This is partly about the island itself; indeed I know one participant was already planning a family holiday here and another was looking at Tiree houses for sale on RightMove!  But it was also the intensity of five, sometimes relaxed, sometimes frenetic, days together.

So what did we do?

There was no programme of twenty-minute talks, no keynotes or demos, indeed no plan or schedule at all; unusual in our diary-obsessed, deadline-driven world.

Well, we talked.  Not at a podium with microphone and Powerpoint slides, but while sitting around tables, while walking on the beach, and while standing looking up at Tilly, the community wind turbine, the deep sound of her swinging blades resonating in our bones.  And we continued to talk as the sun fell and the overwhelmingly many stars came out; we talked while eating, while drinking and while playing (not so expertly) darts.

We met people from the island: those who came to the open evening on Saturday or popped in during the days, and some at the Harvest Service on Sunday.  We met Mark, who told us about the future plans for Tiree Broadband; Jane at PaperWorks, who made everything happen; Fiona and others at the Lodge, who provided our meals; and many more. Indeed, many thanks to all those on the island who in various ways helped or made those at TTW feel welcome.

We also wrote.  We wrote on sheets of paper, notes and diagrams, and filled in TAPT forms for Clare, who was attempting to unpack our experiences of peace and calmness in the hope of designing computer systems that aid rather than assault our solitude.  Three large Magic Whiteboard sheets were entitled “I make because …”, “I make with …” and “I make …”, and were filled with comments.  And, in these days of measurable objectives, I know that at least a grant proposal, a book chapter and a paper were written during the long weekend; the comments on the whiteboards and experiences of the event will be used to create a methodological reflection on the role of making in research, which we’ll put into Interfaces and the TTW web site.

We moved.  Walking, throwing darts, washing dishes, and, I think, all heavily gesturing with our hands while talking.  And we became more aware of those movements during Layda’s warm-up improvisation exercises, when we mirrored one another’s movements, before using our bodies in RePlay to investigate issues of creativity and act out the internal architecture of Magnus’ planned digital literature system.

We directly encountered the chill of wind and warmth of sunshine, the cattle and sheep, often on the roads as well as in the fields.  We saw on maps the pattern of settlement on the island, and on display boards the wools from different breeds on the island. Some of us went to the local historical centre, An Iodhlann (http://www.aniodhlann.org.uk/), to see artefacts, documents and displays of the island in times past, from breadbasket of the west of Scotland to wartime airbase.

We slept.  I in my own bed, some in the Lodge, some in the B&B round the corner, Matjaz and Klem in a camper van and Magnus – brave heart – in a tent amongst the sand dunes.  Occasionally some took a break and dozed in the chairs at the Rural Centre or even nodded off over a good dinner (was that me?).

We showed things we had brought with us, including Magnus’ tangle of wires and circuit boards that almost worked, myself a small pack of FireFly units (enough to play with I hope in a future Tech Wave), Layda’s various pieces she had made in previous tech-arts workshops, Steve’s musical instrument combining Android phone and cardboard foil tube, and Alessio’s impressively modified table lamp.

And we made.  We do after all describe this as a making event!  Helen and Claire explored the limits of ZigBee wireless signals.  Several people contributed to an audio experience using proximity sensors and Arduino boards, and Steve’s CogWork Chip: Lego and electronics, maybe the world’s first mechanical random-signal generator.  Descriptions of many of these and other aspects of the event will appear in due course on the TTW site and participants’ blogs.


But it was a remark that Graham made as he was waiting in the ferry queue that is most telling.  It was not the doing that was central, the making, even the talking, but the fact that he didn’t have to do anything at all.  It was the lack of a plan that made space to fill with doing, or not to do so.

Is that the heart?  We need time and space for non-doing, or maybe even un-doing, unwinding tangles of self as well as wire.

There will be another Tiree Tech Wave in March/April, do come to share in some more not doing then.

Who was there:

  • Alessio Malizia – across the seas from Madrid, blurring the boundaries between information, light and space
  • Helen  Pritchard – artist, student of innovation and interested in cows
  • Claire  Andrews – roller girl and researching the design of assistive products
  • Clare  Hooper – investigating creativity, innovation and a sprinkling of SemWeb
  • Magnus  Lawrie – artist, tent-dweller and researcher of digital humanities
  • Steve Gill – designer, daredevil and (when he can get me to make time) co-author of a book on physicality, TouchIT
  • Graham Dean – ex-computer science lecturer, ex-businessman, and current student and auto-ethnographer of maker-culture
  • Steve Foreshaw – builder, artist, magician and explorer of alien artefacts
  • Matjaz Kljun – researcher of personal information and olive oil maker
  • Layda Gongora – artist, curator, studying improvisation, meditation and wild hair
  • Alan Dix – me

The Great Apple Apartheid

In days gone by boarding houses and shops had notices saying “Irish and Blacks not welcome“.  These days are happily long past, but today Apple effectively says “poor and rural users not welcome“.

This is a story about Apple and the way its delivery policies exacerbate the digital divide and make the poor poorer.  To be fair, similar stories can be told about other software vendors, and it is hardly news that success in business is often at the expense of the weak and vulnerable.  However, Apple’s decision to deliver Lion predominantly via the App Store is an iconic example of a growing problem.

I had been using Lion for a little over a week, not downloaded from the App Store but pre-installed on a brand new MacBook Air.  However, whenever I plugged in my iPhone and tried to sync, a message appeared saying the iTunes library was created with a newer version of iTunes and so iTunes needed to be updated.  Each time I tried to initiate the update as requested, it started a long slow download, but some time later told me that the update had failed.

This at first seemed all a little odd on a brand new machine, but I think the reason is as follows:

  1. When I first initialised the new Air I chose to have it sync data with a Time Machine backup from my previous machine.
  2. The iTunes on the old machine was totally up-to-date due to regular updates.
  3. Apple dealers do not bother to update machines before they are delivered.
  4. The hotel WiFi connection did not have sufficient throughput for a successful update.

From an engineering point of view, the fragility of the iTunes library format is worrying; many will recall the way HyperCard was able to transfer stacks back and forth between versions without loss.  Anyway the paucity of engineering in recent software is a different story!

It is the fact that the hotel WiFi was insufficient for the update that concerns me here.  It was fast enough to browse the web without apparent delay, to check email, etc.  Part of the problem was that the hotel offered two levels of service, one (more expensive!) aimed at heavy multimedia use, so maybe that would have been sufficient.  The essential update for the brand new machine consisted of 1.46 gigabytes of data, so it is perhaps not surprising that the poor connection faltered.

I have been concerned for several years at the ever-increasing size of regular software updates, which have grown from around 100 Mbytes to now often several Gbytes[1].  Usually these happen in the background and I have reasonable broadband at home, so they don’t cause me any problems personally, but I wonder about those with less good broadband, or those whose telephone exchanges do not support broadband at all.  In the UK, this means mainly those outside major urban areas, who are out of reach of cable and fibre super-broadband and reliant on old BT copper lines.  Thinking more broadly across the world, how many in less developed countries or regions will be able to regularly update software?

Of course old versions may well run better on old computers, but without updates it is not just that users cannot benefit from new features; more critically, they are missing essential security updates, leaving them vulnerable to attack.

And this is not just a problem for those directly affected, but for us all, as it creates a fertile ground for bot armies to launch denial of service attacks and other forms of cybercrime or cyberterrorism.   Each compromised machine is a future cyberwarrior or cybergangster.

However, Apple’s decision to launch Lion predominantly via the App Store has significantly upped the stakes.  Those with slower broadband connections may be able to manage updates, but the full operating system is an order of magnitude larger.  Of course those with slower connections tend to be the poorer, more vulnerable, more marginalised: those without jobs, in rural areas, the elderly.  It is as if Apple has put up a big notice:

To the poor and weak
we don’t want you

To be fair, Lion is (one feels grudgingly) also made available on USB drives, but at more than twice the price of the direct download[2].  So this is not entirely shutting the door on the poor, but only letting them in if they pay extra.  A tax on poverty.

Of course, this is not a deliberate act of aggression against the weak, just the normal course of business.  The cheapest and easiest way to deliver software, and one that incidentally ensures that all revenue goes to Apple, is through direct online sales.  The USB option adds complexity and cost to the distribution systems and Apple seem to be pricing to discourage use.  This, like so many other ways in which the poor pay more, is just an ‘accident’ of the market economy.

But for a company that prides itself in design, surely things could be done more creatively?

One way would be to split software into two parts.  One part would be the ‘key’: essential to run the software, but very small.  The second part would constitute the bulk of the software, but be unusable without the ‘key’.  The ‘key’ would then be sold solely on the App Store, but would be small enough for anyone to download.  The rest would also be made available online, but for free download and with a licence that allows third-party distribution (and of course be suitably signed/encrypted to prevent tampering).  Institutions or cybercafes could download it to local networks, and entrepreneurs could sell copies on DVD or USB; competition would mean this would likely end up far cheaper than Apple’s USB premium, close to the cost of the medium, with a small margin.
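
To make the suggestion a little more concrete, here is a toy sketch (in Node.js, using its standard crypto module) of how the two parts might fit together; every name and parameter is an illustrative guess, not a description of any real Apple mechanism:

    const crypto = require('crypto');

    function installBundle(bulk, signature, vendorPublicKey, purchasedKey, iv) {
      // 1. check the freely-copied bulk really came from the vendor,
      //    however it was obtained (DVD, USB, cybercafe download ...)
      const genuine = crypto.createVerify('RSA-SHA256')
                            .update(bulk)
                            .verify(vendorPublicKey, signature);
      if (!genuine) throw new Error('bundle has been tampered with');

      // 2. unlock it with the small 'key' bought from the App Store
      //    (a 32-byte AES key and 16-byte IV in this sketch)
      const decipher = crypto.createDecipheriv('aes-256-cbc', purchasedKey, iv);
      return Buffer.concat([decipher.update(bulk), decipher.final()]);
    }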

Of course the same method could be used for any software, not just Lion, and indeed even for software updates.

I’m sure Apple could think of alternative, maybe better, solutions.  The problem is just that Apple’s designers, despite inordinate consideration for the appearance and appeal of their products, have simply not thought beyond the kind of users they meet in the malls of Cupertino.

  1. Note, this is not an inevitable consequence of increasing complexity and (itself lamentable) code bloat.  In the past software updates were often delivered as ‘deltas’, the changes between old and new.  It seems that now an ‘update’ is in fact complete copies of entire major components.[back]
  2. At the time of writing this, Mac OS X Lion is available from the App Store for $29.99, but the USB thumb drive version is $69.99.[back]

TTW2 – the second Tiree Tech Wave is approaching

It is a little over a month (3-7 Nov)  until the next Tiree Tech Wave 🙂  However, as I’m going to be off-island most of the time until the end of October, it seems very close indeed!

The first registrations are in, including Clare flying straight here from the US[1] and Alessio coming from Madrid; mind you, last time Azizah came all the way from Malaysia, so we are still looking very parochial in comparison!

While I don’t expect we will be oversubscribed, do ‘book early’ (before Oct 10th) if you intend to come, to help us plan things and to make sure you get your preferred accommodation (the tent at the end of my garden is draughty in November) and travel.

If you want to take advantage of the island’s watersports, catch me in one place for more than a day, or simply hang out, do take a few extra days before or after the event.  One person has already booked to arrive a couple of days early, and others may do so too.

To see what the Tech Wave will be like see the Interfaces report … although it is the people who make the event, so I’m waiting to be surprised again this time round 🙂

Looking forward to seeing you.

  1. In fact guided over the ocean by the ‘golf ball’ on Tiree, which is the North Atlantic civil radar.[back]

Death by Satellite

The Upper Atmosphere Research Satellite (UARS) is on its way down after 20 years zipping by at 375 miles above our heads.  As the bus-sized satellite breaks up parts will reach earth and NASA reassuringly tell us that there is only a 1 in 3,200 chance that anyone will be hit.  Given being hit by a piece of satellite is likely to be painful and most likely terminal, I wonder if I should be worried.

With a world population of 6,963,070,029[1], that is around a one in twenty trillion chance that I will die from UARS this year.  Given that the annual risk from asteroid impact or shark attack is around one in 2 billion[2], that sounds quite good for UARS (but I must buy that shark repellent from Boots).

Of course, it is a bit unfair to compare UARS, which has been up there for 20 years spinning round the world like frenzy, with more mundane day-to-day risks like crossing the road.  For air travel they take into account the distance travelled, aiming for safety factors of around 1 accident (but with a lot of people in the aeroplane) every hundred million flying miles, and achieving a figure about 10 times better than that[3].

At 375 miles UARS will have been orbiting at 7.55978 km/s[4], so it has travelled about 2.9 billion miles in the last 20 years.  That means it is causing one death per 10 trillion miles travelled … five thousand times safer than air flight, 120 million times safer than car travel[5], and around a million times safer than the bicycle[6].  I must cancel my KLM ticket home and get one by satellite.
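
For anyone who wants to check the back-of-envelope sums, here they are as a few lines of JavaScript (the unit conversions are mine):

    const population = 6963070029;         // footnote 1
    const pAnyoneHit = 1 / 3200;           // NASA's figure for the whole re-entry
    console.log(pAnyoneHit / population);  // ~4.5e-14: about 1 in 20 trillion

    const kmPerSec = 7.55978;                     // orbital speed at 375 miles up
    const seconds = 20 * 365.25 * 24 * 3600;      // 20 years in seconds
    const miles = kmPerSec * seconds / 1.609344;  // ~2.97e9: the "2.9 billion miles"

    console.log(miles / pAnyoneHit);       // ~9.5e12: one expected death per ~10 trillion miles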

  1. World population of 6,963,070,029 at 5:14 UTC (EST+5) Sep 19, 2011 according to US Census Bureau World Population Clock [back]
  2. Scientific American, “Competing Catastrophes: What’s the Bigger Menace, an Asteroid Impact or Climate Change?“, Robin Lloyd, March 31, 2010 [back]
  3. Wikipedia Air Safety page quotes different, but close numbers: 3 deaths per 10 billion passenger miles, one death in 20 billion passenger miles and 0.05 deaths per billion passenger kilometers.[back]
  4. CalcTool Earth Orbit Calculator[back]
  5. Based on UK figures of 3,431 deaths per year (US NHTSA) and 26.7 billion miles driven in the UK per year (Admiral Insurance).[back]
  6. Wikipedia Air Safety statistics[back]

book: The Unfolding of Language, Deutscher

I have previously read Guy Deutscher’s “Through the Language Glass”, and have now, topsy-turvy, read his earlier book “The Unfolding of Language”.  Both are about language: “The Unfolding of Language” about the development of the complexity of language we see today from simpler origins, and “Through the Language Glass” about the interaction between language and thought.  Both are full of sometimes witty and always fascinating examples drawn from languages around the world, from the Matses in the Amazon to Ancient Sumerian.

I recall my own interest in the origins of language began young: as a seven-year-old over breakfast one day, I asked whether ‘night’ was a contraction of ‘no light’.  While this was an etymological red herring, it is very much the kind of change that Deutscher documents in detail, showing the way a word accretes beginnings and endings through juxtaposition of simpler words, followed by erosion of hard-to-pronounce sounds.

One of my favourite examples was the French “aujourd’hui”.  The word ‘hui’ was Old French for ‘today’, but was originally Latin “hoc die”, “(on) this day”. Because ‘hui’ is not very emphatic it became “au jour d’hui”, “on the day of this day”, which contracted to the current “aujourd’hui”. Except now, to add emphasis, some French speakers are starting to say “au jour d’aujourd’hui”, “on the day of the day of this day”!  This reminds me of Longsleddale in the Lake District (inspiration for Postman Pat‘s Greendale), a contraction of “long sled dale”, which literally means “long valley valley”, from Old English “slaed” meaning “valley” … although I once even saw something suggesting that ‘long’ itself in the name was also “valley” in a different language!

Deutscher gives many more prosaic examples where words meaning ‘I’, ‘you’, ‘she’ get accreted to verbs to create the verb endings found in languages such as French, and how prepositions (themselves metaphorically derived from words like ‘back’) were merged with nouns to create the complex case endings of Latin.

However, the most complex edifice, which Deutscher returns to repeatedly, is that of the Semitic languages with a template system of vowels around three-consonant roots, where the vowel templates change the meaning of the root.  To illustrate he uses the (fictional!) root ‘sng’ meaning ‘to snog’ and discusses how first simple templates such as ‘snug’ (“I snogged”) and then more complex constructions such as ‘hitsunnag’ (“he was made to snog himself”) all arose from simple processes of combination, shortening and generalisation.

“The Unfolding of Language” begins with the 19th century observation that all languages seem to be in a process of degeneration where more complex forms such as the Latin case system or early English verb endings are progressively simplified and reduced. The linguists of the day saw all languages in a state of continuous decay from an early linguistic Golden Age. Indeed one linguist, August Schleicher, suggested that there was a process where language develops until it is complex enough to get things done, and only then recorded history starts, after which the effort spent on language is instead spent in making history.

As with geology, or biological evolution, the modern linguist rejects this staged view of the past, looking instead to the Law of Uniformitarianism: things are as they have always been, so one can work out what must have happened in the pre-recorded past from what is happening now.  However, whilst generally finding this convincing, throughout the book I had a niggling feeling that there is a difference.  By definition, those languages for which we have written records are those of large developed civilisations, which moreover are based on writing. Furthermore, I am aware that in biological evolution small isolated groups (e.g. on islands or cut off in valleys) are particularly important for introducing novelty into larger populations; I assume the same would be true of languages, but somewhat stultified by mass communication.

Deutscher does deal with this briefly, but right at the very end in a short epilogue.  I feel there is a whole additional story about the interaction between culture and the grammatical development of language.  I recall in school a teacher explained how in Latin the feminine words tended to belong to the early period linked to agriculture and the land, masculine words for later interests in war and conquest, and neuter for the still later phase of civic and political development. There were many exceptions, but even this modicum of order helped me to make sense of what otherwise seemed an arbitrary distinction.

The epilogue also mentions that the sole exception to the ‘decline’ in linguistic complexity is Arabic with its complex template system, still preserved today.

While reading the chapters about the three-letter roots, I was struck by the fact that both Hebrew and Arabic are written as consonants only, with the vowels interpolated via diacritical marks or simply remembered convention (although Deutscher does not mention this himself). I had always assumed that this was like English, where t’s pssble t rd txt wth n vwls t ll. However, the vowels are far more critical for Semitic languages, where the vowel-less words could make the difference between “he did it” and “it will be done to him”.  Did this difference in writing stem from the root+template system, or vice versa, or did they simply mutually reinforce each other?

The other factor in Arabic’s remarkable complexity must surely be the Quran. Whereas the Bible was read for over a millennium in Latin, a non-spoken language, and later translations focused on the meaning, there is in contrast a great emphasis on the precise form of the Quran, together with continuous lengthy recitation.  As the King James Bible has been argued to be a significant influence on modern English since the 17th century, it seems likely the Quran has been a factor in preserving Arabic for the last 1500 years.

Early in “The Unfolding of Language” Deutscher dismisses attempts to look at the even earlier prehistoric roots of language, as there is no direct evidence. I assume this would include Mithen’s “The Singing Neanderthals”, which I posted about recently. There is of course a lot of truth in this criticism; certainly Mithen’s account included a lot of guesswork, albeit founded on paleontological evidence.  However, Deutscher’s own arguments include extrapolating into recent prehistory. These extrapolations are based on early written languages and subsequent recorded developments, but also include guesswork between the hard evidence, as does the whole family tree of languages.  Deutscher was originally a Cambridge mathematician, like me, so, perhaps unsurprisingly, I found his style of argument convincing. However, given the foundation on Uniformitarianism, which, as noted above, is at best partial when moving from history to prehistory, there seems to be more of a continuum than a sharp distinction between the levels of interpretation and extrapolation in this book and in Mithen’s.

Deutscher’s account seeks to fill in the gap between the deep prehistoric origins of protolanguage (what Deutscher calls ‘me Tarzan’ language) and its subsequent development in the era of media society (starting some 5,000 years ago with extensive Sumerian writing). Rather than seeing these separately, I feel there is a rich account building across various authors, which will, in time, yield a more complete view of our current language and its past.