changing rules of copyright on the web – the NLA case

I’ve been wondering about the broader copyright implications of a case that went through the England and Wales Court of Appeal earlier this year.  The case was brought by  the NLA (Newspaper Licensing Agency) against Meltwater, who run commercial media-alert services; for example telling  you or your company when and where you have been mentioned in the press.

While the case is specifically about a news service, it appears to have  broader implications for the web, not least because it makes new judgements on:

  • the use of titles/headlines — they are copyright in their own right
  • the use of short snippets (in this case no more than 256 characters) — they too potentially infringe copyright
  • whether a URL link is sufficient acknowledgement of copyright material for fair use – it isn’t!

These, particularly the last, seem to have implications for any form of publicly available lists, bookmarks, summaries, or even search results on the web.  While the NLA specifically allow free services such as Google News and Google Alerts, it appears that this is ‘grace and favour’, not use by right.   I am reminded of the Shetland case1, which led to many organisations having paranoid policies regarding external linking (e.g. seeking explicit permission for every link!).

So, in the UK at least, web copyright law changed significantly through precedent, and I didn’t even notice at the time!

In fact, the original case was heard more than a year ago, in November 2010 (full judgement), and the appeal in July 2011 (full judgement), but it is sufficiently important that the NLA are still headlining it on their home page (see below, and also their press releases (PDF) about the original judgement and the appeal).  So effectively things changed at least at that point, although as this is a judgement about existing law, not new legislation, it presumably also acts retrospectively.  However, I only recently became aware of it after seeing a notice in The Times last week – I guess because it is time for annual licences to be renewed.

Newspaper Licensing Agency (home page) on 26th Dec 2011

The actual case was, in summary, as follows. Meltwater News produce commercial media monitoring services that include the title, first few words, and a short snippet of news items satisfying some criteria, for example mentioning a company name or product.  The NLA have a licence agreement for such companies and for those using such services, but Meltwater claimed it did not need such a licence and, even if it did, its clients certainly did not require any licence.  However, the original judgement and the appeal found pretty overwhelmingly in favour of the NLA.

In fact, my gut feeling in this case was with the NLA.  Meltwater were making substantial money from a service that (a) depends on the presence of news services and (b) would, for equivalent print services, require some form of licence fee to be paid.  So while I actually feel the judgement is fair in the particular case, it makes decisions that seem worrying when looked at in terms of the web in general.

Summary of the judgement

The appeal supported the original judgement, so I summarise here the main points from the latter (indented text quotes the text of the judgement).

Headlines

The status of headlines (and I guess by extension book titles, etc.) in UK law is certainly materially changed by this ruling (para 70/71), compared with previous case law (Fairfax, Para. 62).

Para. 70. The evidence in the present case (incidentally much fuller than that before Bennett J in Fairfax -see her observations at [28]) is that headlines involve considerable skill in devising and they are specifically designed to entice by informing the reader of the content of the article in an entertaining manner.

Para. 71. In my opinion headlines are capable of being literary works, whether independently or as part of the articles to which they relate. Some of the headlines in the Daily Mail with which I have been provided are certainly independent literary works within the Infopaq test. However, I am unable to rule in the abstract, particularly as I do not know the precise process that went into creating any of them. I accept Mr Howe’s submission that it is not the completed work as published but the process of creation and the identification of the skill and labour that has gone into it which falls to be assessed.

Links and fair use

The ruling explicitly says that a link is not sufficient acknowledgement in terms of fair use:

Para. 146. I do not accept that argument either. The Link directs the End User to the original article. It is no better an acknowledgment than a citation of the title of a book coupled with an indication of where the book may be found, because unless the End User decides to go to the book, he will not be able to identify the author. This interpretation of identification of the author for the purposes of the definition of “sufficient acknowledgment” renders the requirement to identify the author virtually otiose.

Links as copies

Para 45 (not part of the judgement, but part of NLA’s case) says:

Para. 45. … By clicking on a Link to an article, the End User will make a copy of the article within the meaning of s. 17 and will be in possession of an infringing copy in the course of business within the meaning of s. 23.

The argument here is that the site has some terms and conditions that say it is not for ‘commercial user’.

As far as I can see the judge equivocates on this issue, but happily does not seem convinced:

Para 100. I was taken to no authority as to the effect of incorporation of terms and conditions through small type, as to implied licences, as to what is commercial user for the purposes of the terms and conditions or as to how such factors impact on whether direct access to the Publishers’ websites creates infringing copies. As I understand it, I am being asked to take a broad brush approach to the deployment of the websites by the Publishers and the use by End Users. There is undoubtedly however a tension between (i) complaining that Meltwater’s services result in a small click-through rate (ii) complaining that a direct click to the article skips the home page which contains the link to the terms and conditions and (iii) asserting that the End Users are commercial users who are not permitted to use the websites anyway.

Free use

Finally, the following extract suggests that NLA would not be seeking to enforce the full licence on certain free services:

Para. 20. The Publishers have arrangements or understandings with certain free media monitoring services such as Google News and Google Alerts whereby those services are currently licensed or otherwise permitted. It would apparently be open to the End Users to use such free services, or indeed a general search engine, instead of a paid media monitoring service without (currently at any rate) encountering opposition from the Publishers. That is so even though the End Users may be using such services for their own commercial purposes. The WEUL only applies to customers of a commercial media monitoring service.

Of course, the fact that they allow it without licence suggests they feel the same copyright rules do apply; that is, the search collation services are subject to copyright.  The judge does not make a big point of this piece of evidence in any way, which would suggest that these free services do not have a right to abstract and link.  However, the fact that Meltwater (the agency the NLA is acting against) is making substantial money was clearly noted by the judge, as was the fact that users could choose to use alternative services for free.

Thinking about it

As noted, my gut feeling is that fairness goes to the newspapers involved; news gathering and reporting is costly, and openly accessible online newspapers are of benefit to us all; so, if news providers are unable to make money, we all lose.

Indeed, years ago in dot.com days, at aQtive we were very careful that onCue, our intelligent internet sidebar, did not break the business models of the services we pointed to. While we effectively pre-filled forms and submitted them silently, we did not scrape results and present these directly, but instead sent the user to the web page that provided the information.  This was partly out of a feeling that this was the right and fair thing to do, partly because if we treated others fairly they would be happy for us to provide this value-added service on top of what they provided, and partly because we relied on these third-party services for our business, so our commercial success relied on theirs.

This would all apply equally to the NLA v. Meltwater case.

However, like the Shetland case all those years ago, it is not the particulars of the case that seem significant, but the wide-ranging implications.  I, like so many others, frequently cite web materials in blog posts, web pages and resource lists by title alone, with the title words as a live link pointing to the source site.  According to this judgement the title is copyright, and even if my use of it is “fair use” (as it normally would be), the use of the live link is NOT sufficient acknowledgement.

Maybe things are not quite so bad as they seem. In the NLA vs. Meltwater case, the NLA had a specific licence model and agreement.  The NLA were not seeking retrospective damages for copyright infringement before this was in place, merely requiring that Meltwater subscribe fully to the licence.  The issue was not just that copyright had been infringed, but that it had been infringed when there was a specific commercial option in place.  In UK copyright law, I believe, it is not sufficient to show that copyright has been infringed; one must also show that the copyright owner has been materially disadvantaged by the infringement; so, the existence of the licence option was probably critical to the specific judgement.   However, the general principles probably apply to any case where the owner could claim damage … and maybe claim so merely in order to seek an out-of-court settlement.

This case was resolved five months ago, and I’ve not heard of any rush of law firms creating vexatious copyright claims.  So maybe there will not be any long-lasting major repercussions from the case … or maybe the storm is still to come.

Certainly, the courts have become far more internet savvy since the 1990s, but judges can only deal with the laws they are given, and it is not at all clear that law-makers really understand the implications of their legislation for the smooth running of the web.

  1. This was the case in the late 1990s where the Shetland Times sued the Shetland News for including links to its articles.  Although the particular case involved material that appeared to be re-badged, the legal issues endangered the very act of linking at all. See NUJ Freelance “NUJ still supports Shetland News in internet case”, BBC “Shetland Internet squabble settled out of court”, The Lawyer “Shetland Internet copyright case is settled out of court”.[back]

ignorance or misinformation – the press and higher education

I guess I shouldn’t be surprised at poor reporting in the Mail, but it does feel slightly more serious than the other tabloids.  I should explain I have a copy of the Mail as it was the only UK paper when I got on the Malaysian Airlines plane in Kuala Lumpur on Tuesday evening, and it is the Monday copy as I assume it had flown out of the UK on the flight the day before!

Deepish inside, p22, the article was “UK students lose out in sciences” by Nick Mcdermott.  The article quotes a report by Civitas that shows that while the annual number of students in so called STEM (Science, Technology, Engineering and Maths) courses rose by around 6500 in the 10 years 1997-2007, in fact this is largely due to an increase of 12,308 in overseas students and a fall in UK students of nearly 6000.  Given an overall increase in student numbers of 600,000 in this period and employers “calling for more science graduates”, the STEM drop is particularly marked.

While I assume the figures are correct, the Mail article leaves the false impression that the overseas students are in some way taking places from the UK students; indeed the article’s title “UK students lose out” suggests precisely this.  I can’t work out if this is simply the writer’s ignorance of the UK higher education system, or deliberate misinformation — neither is good news for British journalism.

Of course, the truth is precisely the opposite.  Overseas students are not in competition with UK students for undergraduate places in STEM or other subjects, as the number of UK students is effectively controlled by a combination of Government quotas and falling student demand in STEM subjects.  The latter, a lack of interest in the traditionally ‘hard’ subjects among University applicants, has led to the closure of several university science departments across the country.  Rather than competing with UK students, the presence of overseas students makes courses more likely to be viable and thus preserves the variety of education available for UK students.  Furthermore, the higher fees for overseas students, compared with the combined student fees and government monies for UK students, mean that, if anything, these overseas students subsidise their UK colleagues.

We should certainly be asking why it is that an increasing number of overseas students value the importance of a science/engineering training while their British counterparts eschew these areas.  However, the blame for the lack of UK engineering graduates does not lie with the overseas students, but closer to home.  Somehow in our school system and popular culture we have lost a sense of the value of a deep scientific education.  Until this changes and UK students begin to apply for these subjects, we cannot expect there to be more UK graduates.  In the meantime, we can only hope that there will be more overseas students coming to study in the UK, keeping the scientific and engineering expertise of universities alive until our own country finally comes to its senses.

After the Tech Wave is over

The Second Tiree Tech Wave is over.   Yesterday the last participants left by ferry and plane, and after a final few hours’ tidying, the Rural Centre, which the day before had been a tangle of wire and felt, books and papers, cups and biscuit packets, is now as it had been before.  And as I left, the last boxes under my arm, it was strangely silent, with only the memory of voices and laughter in my mind.

So is it as if it had never been?  Is there anything left behind?  There are a few sheets of Magic Whiteboard on the walls, which I left so that those visiting the Rural Centre in the coming weeks can see something of what we were doing, and there are used teabags and fish-and-chip boxes in the bin, but few traces.

We trod lightly, like the agriculture of the island, where Corncrake and orchid live alongside sheep and cattle.

Some may have heard me talk about the way design is like a Spaghetti Western. In the beginning of the film Clint Eastwood walks into the town, and at the end walks away.  He does not stay, happily ever after, with a girl on his arm, but leaves almost as if nothing had ever happened.

But while he, like the designer, ultimately leaves, things are not the same.  The Carson brothers who had the town in fear for years lie dead in their ranch at the edge of town, the sharp tang of gunfire still in the air and the buzz of flies slowly growing over the elsewise silent bodies.  The crooked mayor, who had been in the pocket of the Carson brothers, is strapped over a mule heading across the desert towards Mexico, and not a few wooden rails and water butts need to be repaired.  The job of the designer is not to stay, but to leave, yet leave change: intervention more than invention.

But the deepest changes are not those visible in the bullet-pocked saloon door, but in the people.  The drunk who used to sit all day at the bar, has discovered that he is not just a drunk, but he is a man, and the barmaid, who used to stand behind the bar has discovered that she is not just a barmaid, but she is a woman.

This is true of the artefacts we create and leave behind as designers, but much more so of the events, which come and go through our lives.  It is not so much the material traces they leave in the environment, but the changes in ourselves.

I know that, as the plane and ferry left with those last participants, a little of myself left with them, and I know many, probably all, felt a little of themselves left behind on Tiree.  This is partly about the island itself; indeed I know one participant was already planning a family holiday here and another was looking at Tiree houses for sale on RightMove!  But it was also the intensity of five, sometimes relaxed, sometimes frenetic, days together.

So what did we do?

There was no programme of twenty-minute talks, no keynotes or demos, indeed no plan nor schedule at all, unusual in our diary-obsessed, deadline-driven world.

Well, we talked.  Not at a podium with microphone and PowerPoint slides, but while sitting around tables, while walking on the beach, and while standing looking up at Tilly, the community wind turbine, the deep sound of her swinging blades resonating in our bones.  And we continued to talk as the sun fell and the overwhelmingly many stars came out; we talked while eating, while drinking and while playing (not so expertly) darts.

We met people from the island: those who came to the open evening on Saturday, or popped in during the days, and some at the Harvest Service on Sunday.  We met Mark, who told us about the future plans for Tiree Broadband, Jane at PaperWorks, who made everything happen, Fiona and others at the Lodge, who provided our meals, and many more. Indeed, many thanks to all those on the island who in various ways helped or made those at TTW feel welcome.

We also wrote.  We wrote on sheets of paper, notes and diagrams, and filled in TAPT forms for Clare, who was attempting to unpack our experiences of peace and calmness in the hope of designing computer systems that aid rather than assault our solitude.  Three large Magic Whiteboard sheets were entitled “I make because …”, “I make with …”, “I make …” and were filled with comments.  And, in these days of measurable objectives, I know that at least a grant proposal, a book chapter and a paper were written during the long weekend; and the comments on the whiteboards and experiences of the event will be used to create a methodological reflection on the role of making in research, which we’ll put into Interfaces and the TTW web site.

We moved.  Walking, throwing darts, washing dishes, and I think all heavily gesturing with our hands while talking.  We became more aware of those movements during Layda’s warm-up improvisation exercises, when we mirrored one another’s movements, before using our bodies in RePlay to investigate issues of creativity and act out the internal architecture of Magnus’ planned digital literature system.

We directly encountered the chill of wind and warmth of sunshine, and the cattle and sheep, often on the roads as well as in the fields.  We saw on maps the pattern of settlement on the island and on display boards the wools from the island’s different breeds. Some of us went to the local historical centre, An Iodhlann (http://www.aniodhlann.org.uk/), to see artefacts, documents and displays of the island in times past, from breadbasket of the west of Scotland to wartime airbase.

We slept.  I in my own bed, some in the Lodge, some in the B&B round the corner, Matjaz and Klem in a camper van and Magnus – brave heart – in a tent amongst the sand dunes.  Occasionally some took a break and dozed in the chairs at the Rural Centre or even nodded off over a good dinner (was that me?).

We showed things we had brought with us, including Magnus’ tangle of wires and circuit boards that almost worked, myself a small pack of FireFly units (enough to play with I hope in a future Tech Wave), Layda’s various pieces she had made in previous tech-arts workshops, Steve’s musical instrument combining Android phone and cardboard foil tube, and Alessio’s impressively modified table lamp.

And we made.  We do after all describe this as a making event!  Helen and Claire explored the limits of ZigBee wireless signals.  Several people contributed to an audio experience using proximity sensors and Arduino boards, and Steve’s CogWork Chip: Lego and electronics, maybe the world’s first mechanical random-signal generator.  Descriptions of many of these and other aspects of the event will appear in due course on the TTW site and participants’ blogs.


But it was a remark that Graham made as he was waiting in the ferry queue that is most telling.  It was not the doing that was central, the making, even the talking, but the fact that he didn’t have to do anything at all.  It was the lack of a plan that made space to fill with doing, or not to do so.

Is that the heart?  We need time and space for non-doing, or maybe even un-doing, unwinding tangles of self as well as wire.

There will be another Tiree Tech Wave in March/April, do come to share in some more not doing then.

Who was there:

  • Alessio Malizia – across the seas from Madrid, blurring the boundaries between information, light and space
  • Helen Pritchard – artist, student of innovation and interested in cows
  • Claire Andrews – roller girl and researching the design of assistive products
  • Clare Hooper – investigating creativity, innovation and a sprinkling of SemWeb
  • Magnus Lawrie – artist, tent-dweller and researcher of digital humanities
  • Steve Gill – designer, daredevil and (when he can get me to make time) co-author of a book on physicality, TouchIT
  • Graham Dean – ex-computer science lecturer, ex-businessman, and current student and auto-ethnographer of maker-culture
  • Steve Foreshaw – builder, artist, magician and explorer of alien artefacts
  • Matjaz Kljun – researcher of personal information and olive oil maker
  • Layda Gongora – artist, curator, studying improvisation, meditation and wild hair
  • Alan Dix – me

Or … is Amazon becoming the publishing industry?

A recent Blog Kindle post asked “Is Amazon’s Kindle Destroying the Publishing Industry?”.  The post defends Kindle, seeing the traditional publishers as reactionaries whose business model depended on paper publishing and, effectively, on keeping authors from their public.

However, as an author myself (albeit an academic one), this seems to completely miss the reasons for the publishing industry.  The printing of physical volumes has long been a minimal part of the value; indeed traditional publishers have made good use of changes in the physical print industry to outsource actual production.  The core values for the author are the things around this: marketing, distribution and payment management.

Of these, distribution is of course much easier now with the web, whether delivering electronic copies, or physical copies via print-on-demand services.  However, the other core values persist – at their best publishers do not ring-fence the public from the author, but on the contrary connect the two.

I recall as a child being in the Puffin Club and receiving the monthly magazine.  I could not afford many books at the time, but since have read many of the books described in its pages and recall the excitement of reading those reviews.  A friend has a collection of the early Puffins (1-200) in their original covers; although some stories age, some are better, some worse, still just being a Puffin Book was a pretty good indication it was worth reading.

The myth we are being peddled is of a dis-intermediated networked world where customers connect directly to suppliers, authors to readers1, musicians to fans.  For me, this has some truth: I am well enough known and well enough connected to distribute effectively.  However, for most, that ‘direct’ connection is mediated by one of a small number of global sites … and a smaller number of companies: YouTube, Twitter, Google, iTunes, eBay, not to forget Amazon.

For publishing as in other areas, what matters is not physical production, the paper, but the route, the connection, the channel.

And crucially Kindle is not just the device, but the channel.

The issue is not whether Kindle kills the publishing industry, but whether Amazon becomes the publishing industry.  Furthermore, if Amazon’s standard markdown and distribution deals for small publishers are anything to go by, Amazon is hardly going to be a cuddly home for future authors.

To some extent this is an apparently inexorable path already trodden in the traditional industries, with a few large publishing conglomerates buying up the smaller publishing houses, and on the high street a few large bookstore chains such as Waterstones and Barnes & Noble squeezing out the small bookshops (remember “You’ve Got Mail“), and it is hard to have sympathy with Waterstones’ recent financial problems given this history.

Philip Jones of the Bookseller recently blogged about these changes, noting that it is in fact book selling, not publishing, that is struggling with profits … even at Amazon – no wonder Amazon want more of the publishing action.  However, Jones also notes that while “digital will lead to smaller book chains, stocking fewer titles”, in fact “It wasn’t digital that drove this, but it is about to deliver the coup de grâce.”

Which does seem a depressing vision both as author and reader.

  1. Maybe unbound.co.uk is actually doing this – see Guardian article, although it sounds more useful to the already successful writer than the new author.[back]

the real tragedy of the commons

I’ve just been reviewing a paper that mentions the “tragedy of the commons”1  and whenever I read or hear the phrase I feel the hackles on the back of my neck rise.

Of course the real tragedy of the commons was not free-riding and depletion by common use, but the rape of the land under mass eviction or enclosure movements when it ceased to be commons.  The real tragedy of “the tragedy of the commons” as a catch phrase is that it is often used to promote the very same practices of centralisation.  Where common land has survived today, just as in the time before enclosures and clearances, it is still managed in a collaborative way, both for the people now and for the sake of future generations.  Indeed on Tiree, where I live, there are large tracts of common grazing land managed in just such a way.

It is good to see that the Wikipedia article on the “Tragedy of the Commons” does give a rounded view of the topic, including reference to a historical and political critique by Ian Angus2.

The paper I was reading was not alone in uncritically using the phrase.  Indeed in “A Framework for Web Science”3 we read:

In a decentralised and growing Web, where there are no “owners” as such, can we be sure that decisions that make sense for an individual do not damage the interests of users as a whole? Such a situation, known as the ‘tragedy of the commons’, happens in many social systems that eschew property rights and centralised institutions once the number of users becomes too large to coordinate using peer pressure and moral principles.

In fact I do have some sympathy with this, as the web involves a vast number of physically dispersed users who are perhaps “too large to coordinate using peer pressure and moral principles”.  However, what is strange is that the web has raised so many modern counter-examples to the tragedy of the commons, not least Wikipedia itself.  In many open source projects people work in what is effectively a form of gift economy, where, if there is any reward, it is in the form of community or individual respect.

Clearly, there are examples in the world today where many individual decisions (often for short-term gain) lead to larger-scale collective loss.  This is most clearly evident in the environment, but also in the recent banking crisis, which was fuelled by the desire for large mortgages and general debt-led lives.  However, these are exactly the opposite of the values surrounding traditional common goods.

It may be that the problem is not so much that large numbers of people dilute social and moral pressure, but that the impact of our actions becomes too diffuse to be able to appreciate when we make our individual life choices.  The counter-culture of many parts of the web may reflect, in part, the way in which aspects of the web can make the impact of small individual actions more clear to the individual and more accountable to others.

  1. Garrett Hardin, “The Tragedy of the Commons”, Science, Vol. 162, No. 3859 (December 13, 1968), pp. 1243-1248. … and here is the danger of citation counting as a quality metric, I am citing it because I disagree with it![back]
  2. Ian Angus. The Myth of the Tragedy of the Commons. Socialist Voice, August 24, 2008[back]
  3. Berners-Lee, T., Hall, W., Hendler, J. A., O’Hara, K., Shadbolt, N. and Weitzner, D. J. (2006) A Framework for Web Science. Foundations and Trends in Web Science, 1 (1). pp. 1-130.  http://eprints.ecs.soton.ac.uk/13347/[back]

announcing Tiree Tech Wave!

Ever since I came to Tiree I’ve had a vision of bringing people here, to share some of the atmosphere and work together.  A few of you have come on research visits and we have had some really productive times.  Others have said they wished they could come sometime.

Well now is your chance …

Come to Tiree Tech Wave in March to make, talk and play at the wind-ripping edge of digital technology.


Every year Tiree hosts the Wave Classic, a key international wind surfing event.  Those of us at the edge of the digital wave do not risk cold seas and bodily injury, but there is something of the same thrill as we explore the limits of code, circuit boards and social computation.

The cutting edge of wind-surfing boards is now high technology, but they are typically made by artisan craftsfolk, themselves often surfers.  Similarly, hardware platforms such as Arduino, mobile apps for iPhone and Android, and web mashups enabled by public APIs and linked data are all enabling a new maker culture, challenging the hegemony of global corporations.

The Western Celtic fringes were one of the oases of knowledge and learning during the ‘dark ages’.  There is something about the empty horizon that helped the hermit to focus on God and inspired a flowering of decorative book-making, even in the face of battering storms of winter and Viking attacks of summer; a starkness that gave scholars time to think in peace between danger-fraught travel to other centres of learning across Europe.

Nowadays regular Flybe flights and Calmac ferries reduce the risk of Viking attacks whilst travelling to the isles, broadband Internet and satellite TV invade the hermit cell, and double glazing and central heating mollify the elements.  Yet there is still a rawness that helps focus the mind, a slightly more tenuous connection to the global infrastructure that fosters a spirit of self-reliance and independence.

Over a long weekend 17 – 21 March (TBC), we plan what I hope will be a semi-regular event.  A time to step out, albeit momentarily, from a target-driven world, to experiment and play with hardware and software, to discuss the issues of our new digital maker culture, what we know and what we seek to understand, and above all to make things together.

This is all about technology and people: the physical device that sits in our hands, the data.gov.uk mashup that tells us about local crime, the new challenges to personal privacy and society and the nation state.

Bring your soldering iron, and Arduino boards, your laptop and API specs, your half-written theses and semi-formed ideas, your favourite book or even well-loved eReader (!).  The format will be informal, with lots of time to work hands-on together; however, there will be the opportunity for short talks/demos/how-to-do-it sessions.  Also, if there is demand, I’d  be happy to do some more semi-formal tutorial sessions and maybe others would too (Arduino making, linked data).

Currently we have no idea whether there will be three or three hundred people interested, but we are aiming for something like 15 – 30 participants.  We’ll keep costs down, probably around £70 for meeting rooms, lunches, etc. over the five days, but will confirm that and more details shortly.

Follow on Twitter at @tireetechwave and the website will be at tireetechwave.com. However, it is still ‘under development’, so don’t be surprised at the odd glitch over the next couple of weeks as we sort out details.

If you are interested in coming or want to know more, mail me or Graham Dean.

Web Art/Science Camp — how web killed the hypertext star and other stories

Had a great day on Saturday at the Web Art/Science Camp (twitter: #webartsci, lanyrd: web-art-science-camp). It was the first event that I went to primarily with my Talis hat on, and my first Web Science event, so I was very pleased that Clare Hooper told me about it during the DESIRE Summer School.

The event started on Friday night with a lovely meal in the restaurant at the British Museum. The museum was partially closed in the evening, but in the open galleries the Rosetta Stone, the Elgin Marbles and a couple of enormous totem poles were all very impressive. … and I notice the BM’s website, when it describes the Parthenon Sculptures, does not waste the opportunity to tell us why they should not be returned to Greece!

Treasury of Atreus

I was fascinated too by images of the “Treasury of Atreus” (which is actually a Greek tomb, also known as the Tomb of Agamemnon). The tomb has a corbelled arch (triangular stepped stones, as visible in the photo) in order to relieve load on the lintel. However, whilst the corbelled arch was an important technological innovation, the aesthetics of the time meant they covered up the triangular opening with thin slabs of fascia stone and made it look as though the lintel was actually supporting the wall above — rather like modern concrete buildings with decorative classical columns.

how web killed the hypertext star

On Saturday, the camp proper started with Paul de Bra from TU/e giving a sort of retrospective on pre-web hypertext research and on whether there is any need for hypertext research anymore. The talk brought out several of the issues that have also worried me for some time: so many of the lessons of early hypertext lost in the web1.

For me one of the most significant issues is external linkage. HTML embeds links in the document using <a> anchor tags, so that only the links the author has thought of can be present (and only one link per anchor). In contrast, mature pre-web hypertext systems, such as Microcosm2, specified links externally to the document, so that third parties could add annotation and links. I had a few great chats about this with one of the Southampton Web Science DTC students; in particular, about whether Google or Wikipedia effectively provide all the external links one needs.
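
To make the contrast concrete, here is a minimal sketch of the idea (my own toy illustration, not how Microcosm actually worked): the links live in a separate ‘linkbase’ and are applied to a plain document at render time, so a third party could supply a different linkbase over the same text. The phrases and URLs are made up for the example.

    import re

    # Toy linkbase: links are stored apart from the document they point into.
    # (Hypothetical phrases and URLs, purely for illustration.)
    LINKBASE = {
        "corbelled arch": "https://example.org/architecture/corbelled-arch",
        "Microcosm": "https://example.org/hypertext/microcosm",
    }

    def apply_links(text: str, linkbase: dict) -> str:
        """Wrap each phrase known to the linkbase in an <a> tag at render time.

        The source document contains no anchors of its own; swapping in a
        different linkbase yields different links over identical text.
        """
        for phrase, url in linkbase.items():
            text = re.sub(re.escape(phrase),
                          f'<a href="{url}">{phrase}</a>',
                          text, count=1)
        return text

    print(apply_links("The tomb has a corbelled arch above the lintel.", LINKBASE))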

Paul’s brief history of hypertext started, predictably, with Vannevar Bush‘s “As We May Think” and Memex; however, he pointed out that Bush’s vision was based on associative connections (like the human mind) and trails (a form of narrative), not pairwise hypertext links. The latter reminded me of Nick Hammond’s bus tour metaphor for guided educational hypertext in the 1980s — occasionally since then I have seen things a little like this, and indeed narrative was an issue that arose in different guises throughout the day.

While Bush’s trails are at least related to the links of later hypertext and the web, the idea of associative connections seems to have been virtually forgotten.  More recently on the web, however, IR (information retrieval) based approaches for page suggestions, such as Alexa, and content-based social networking have elements of associative linking, as does the use of spreading activation in web contexts3.

It was of course Nelson who coined the term hypertext, but Paul reminded us that Ted Nelson’s vision of hypertext in Xanadu is far richer than the current web.  As well as external linkage (and indeed more complex forms in his ZigZag structures, a form of faceted navigation), Xanadu’s linking was often in the form of transclusions: pieces of one document appearing, quoted, in another. Nelson was particularly keen on having only one copy of anything, hence the transclusion is not so much a copy as a reference to a portion. The idea of having exactly one copy seems a bit of a computing obsession, and in non-technical writing it is common to have quotations that are in some way edited (elision, emphasis), but the core thing to me is that the target of a link, as well as the source, need not be the whole document, but some fragment of it.

Paul de Bra's keynote at Web Art/Science Camp (photo Clare Hooper)

Over a period of 30 years hypertext developed and started to mature … until in the early 1990s came the web, and so much of hypertext died with its birth … I guess a bit like the way Java all but stiltified programming languages. Paul had a lovely list of bad things about the web compared with (1990s) state-of-the-art hypertext:

Key properties/limitations in the basic Web:

  1. uni-directional links between single nodes
  2. links are not objects (have no properties of their own)
  3. links are hardwired to their source anchor
  4. only pre-authored link destinations are possible
  5. monolithic browser
  6. static content, limited dynamic content through CGI
  7. links can break
  8. no transclusion of text, only of images

Note that 1, 3 and 4 are all connected with the way that HTML embeds links in pages rather than adopting some form of external linkage. However, 2 is also interesting: the fact that links are not ‘first-class objects’. This has been preserved in the semantic web, where an RDF triple is not itself easily referenced (except by complex ‘reification’), and so it is hard to add information about relationships, such as provenance.
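
As a concrete illustration of how awkward this is, here is a minimal sketch using the Python rdflib library (my example, with made-up example.org URIs): to say anything about a single triple you have to reify it, minting a separate node that stands for the statement itself.

    from rdflib import Graph, URIRef, BNode, Literal
    from rdflib.namespace import RDF, DCTERMS

    g = Graph()

    # The plain triple: a link-like relationship between two resources.
    s = URIRef("http://example.org/pageA")
    p = URIRef("http://example.org/vocab/linksTo")
    o = URIRef("http://example.org/pageB")
    g.add((s, p, o))

    # To record anything *about* that relationship (who asserted it, when, ...)
    # we must reify it: a new node standing for the statement itself.
    stmt = BNode()
    g.add((stmt, RDF.type, RDF.Statement))
    g.add((stmt, RDF.subject, s))
    g.add((stmt, RDF.predicate, p))
    g.add((stmt, RDF.object, o))
    g.add((stmt, DCTERMS.creator, Literal("third-party annotator")))

    print(g.serialize(format="turtle"))

Four extra triples just to be able to say one thing about a link; compare that with hypertext systems where the link was an object with properties of its own.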

Of course, the very simplicity (or even simplistic nature) that reduced the expressivity of HTML compared with earlier hypertext is also the reason for its success over earlier, more heavyweight and usually centralised solutions.

However, Paul went on to describe how many of the features that were lost have re-emerged in plugins and server enhancements (this made me think of systems such as zLinks, which start to add an element of external linkage). I wasn’t totally convinced, as these features are still largely in research prototypes and have not entered the mainstream, but it made a good end to the story!

demos and documentation

There was a demo session as well as some short demos as part of talks. Lots of interesting ideas. One that particularly caught my eye (although not incredibly webby) was Ana Nelson‘s documentation generator “dexy” (not to be confused with doxygen, another documentation generator). Dexy allows you to include code and output, including screenshots, in documentation (LaTeX, HTML, even Word if you work a little) and live-updates the documentation as the code updates (at least it updates the code and output; you need to change the words!).  It seems to be both a test harness and a multi-version documentation compiler all in one!
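
To give a flavour of the idea (this is my own toy sketch, not dexy’s actual syntax or API): run a snippet, capture its output, and splice both into the generated documentation, so the code and its output can never drift apart. The file names here are hypothetical.

    import subprocess

    snippet_path = "example.py"  # hypothetical script being documented

    with open(snippet_path) as f:
        source = f.read()

    # Run the snippet and capture its output, so the generated page always
    # shows the result of the code as it currently stands.
    result = subprocess.run(["python", snippet_path],
                            capture_output=True, text=True, check=True)

    template = """<h2>Example</h2>
    <pre><code>{code}</code></pre>
    <p>Output:</p>
    <pre>{output}</pre>
    """

    with open("example.html", "w") as f:
        f.write(template.format(code=source, output=result.stdout))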

I recall that many years ago, while he was still at York, Harold Thimbleby was doing something a little similar when he was working on his C version of Knuth’s WEB literate programming system. Ana’s system is language neutral and takes advantage of recent developments, in particular the use of VMs to be able to test install scripts and to be sure of running code in a consistent environment. Also it can use browser automation for web docs — very cool 🙂

Relating back to Paul’s keynote this is exactly an example of Nelson’s transclusion — the code and outputs included in the document but still tied to their original source.

And on this same theme I demoed Snip!t as an example of both:

  1. attempting to bookmark parts of web pages, a form of transclusion
  2. using data detectors, a form of external linkage (see the toy sketch below)
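
For the second of these, here is a minimal sketch of what I mean by a data detector (my own toy illustration, not Snip!t’s actual code): a pattern is recognised in arbitrary text and a link is attached to it, even though the page author never authored an anchor there.

    import re

    # Crude ISBN-10 matcher, purely for illustration.
    ISBN_PATTERN = re.compile(r"\b(?:\d[- ]?){9}[\dXx]\b")

    def detect_and_link(text: str) -> str:
        """Wrap anything that looks like an ISBN in a (hypothetical) search link."""
        def to_link(match: re.Match) -> str:
            isbn = match.group(0)
            return f'<a href="https://example.org/search?isbn={isbn}">{isbn}</a>'
        return ISBN_PATTERN.sub(to_link, text)

    print(detect_and_link("A well-known example ISBN is 0-306-40615-2."))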

Another talk/demo showed how Compendium could be used to annotate video (in this case regarding fashion design) and build rationale around it … yet another example of external linkage in action.

… and when looking, after the event, at some of Weigang Wang‘s work on collaborative hypermedia, it was pleasing to see that it uses a theoretical framework for shared understanding in collaborative hypermedia that builds upon my own CSCW framework from the early 1990s 🙂

sessions: narrative, creativity and the absurd

Impossible to capture in a few words, but one session included different talks and discussion about the relation of narrative and various forms of web experiences — including a talk on the cognitive psychology of the Kafkaesque. Also discussion of creativity with Nathan live recording in IBIS!

what is web science

I guess inevitably in a new area there was some discussion about “what is web science” and even “is web science a discipline”. I recall similar discussions about the nature of HCI 25 years ago, still not entirely resolved today … and, as an artist who was there reminded us, they still struggle with “what is art?”!

Whether or not there is a well-defined discipline of ‘web science’, the web definitely throws up new issues for many disciplines, including new challenges for computing in terms of scale, and new opportunities for the social sciences in terms of intrinsically documented social interactions. One of the themes that recurred to distinguish web science from simply web technology is the human element — joy to my ears of course as an HCI man, but I think maybe not the whole story.

Certainly the gathering of people from different backgrounds in a sort of disciplinary bohemia is exciting whether or not it has a definition.

  1. see also “Names, URIs and why the web discards 50 years of computing experience“[back]
  2. Wendy Hall, Hugh Davis and Gerard Hutchings, “Rethinking Hypermedia: The Microcosm Approach”, Springer, 1996.[back]
  3. Spreading activation is used by a number of people, some of my own work with others at Athens, Rome and Talis is reported in “Ontologies and the Brain: Using Spreading Activation through Ontologies to Support Personal Interaction” and “Spreading Activation Over Ontology-Based Resources: From Personal Context To Web Scale Reasoning“.[back]

UK internet far from ubiquitous

On the last page of the Guardian on Saturday (13th Oct) in a sort of ‘interesting numbers’ section, they say that:

“30% of the UK population have no internet access at home”

I couldn’t find the exact source of this; however, another Guardian article, “UK internet audience rises by 1.9 million over last year”, dated Wednesday 30 June 2010, has a similar figure.  This says that internet use has grown to 38.8 million users. The Office for National Statistics says the overall UK population is 61,792,000 with 1/5 under 16, so call that 2 in 16 under 10, or around 8 million. That gives a population of a little under 54 million over 10 years old, of whom still only around 70% are actually using the web at all.
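
The back-of-envelope arithmetic, using only the figures quoted above (the rounding is mine):

    population = 61_792_000          # overall UK population
    under_10 = population * 2 / 16   # "call that 2 in 16 under 10", ~8 million
    over_10 = population - under_10  # a little under 54 million
    internet_users = 38_800_000      # "internet use has grown to 38.8 million"

    print(f"over-tens: {over_10 / 1e6:.1f} million")              # ~54.1 million
    print(f"share using the web: {internet_users / over_10:.0%}") # ~72%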

My guess is that some of the people with internet at home do not use it, and some of those without home connections access it by other means (mobile, at school, cyber cafés), but by both measures we are hardly a society where the web is as ubiquitous as one might have imagined.

French subvert democratic process to pass draconian internet laws

Just saw on Rob @ dynamicorange that the French have passed a law forcing ISPs to withdraw access based on accusations of IP infringement. Whether one agrees or disagrees with, or even understands, the issues involved, it appears this was forced through by a vote of 16 (out of 577) members of the French parliament at a time when the vote was not expected.  This reminds me of the notorious Shetland Times case back in the late 1990s, where the judgement implied that simply linking to another site infringed copyright and caused some sites to stop interlinking for fear of prosecution1, not to mention some early US patents that were granted because patent officers simply did not understand the technology and its implications2.

It would be nice to think that the UK had learnt from the Shetland case, but sadly not.  Earlier this year the Government released its interim Digital Britain report. This starts well, declaring “The success of our manufacturing and services industries will increasingly be defined by their ability to use and develop digital technologies”; however, the sum total of its action plan to promote ‘Digital Content’ is to strengthen IP protection.  Whatever one’s views on copyright, file sharing, etc., the fact that a digital economy is a global economy seems to have somehow been missed on the way; and this is the UK’s “action plan to secure the UK’s place at the forefront of innovation, investment and quality in the digital and communications industries”3.

  1. See “Copyright battles: The Shetlands” @ Ariadne and “Scottish Court Orders Online Newspaper to Remove Links to Competitor’s Web Site” @ Harvard’s Berkman Center for Internet & Society.[back]
  2. and for that matter, more recent cases like the ‘wish list’ patent[back]
  3. UK Department for Culture, Media and Sport Press Release 106/08 “Digital Britain – the future of communications” 17th October 2008[back]

From raw experience to personal reflection

Just a week to go to the deadline for this workshop on Designing for Reflection on Experience that Corina and I are organising at CHI. Much of the time, discussions of user experience are focused on trivia, and even social networking often appears to stop at superficial levels.  While throwing a virtual banana at a friend may serve to maintain relationships, and is perhaps less trivial than it at first appears, there is still little support for deeper reflection on life, with the possible exception of the many topic-focused chat groups.  However, in researching social networks we have found, amongst the flotsam, clear moments of poignancy and conflict, traces of major life events … even divorce by Facebook. Too much navel-gazing would not be a good thing, but some attention to expressing deeper issues to others and to ourselves seems overdue.