making life easier – quick filing in visible folders

It is one of those things that has bugged me for years … and if it were right I would probably not even notice it was there – such is the nature of good design. But when I am saving a file from an application and I already have a folder window open, why is it not easier to select the open folder as the destination?

A scenario: I have just been writing a reference for a student and have a folder for the references open on my desktop. I select “Save As …” from the Word menu and get a file selection dialogue, but I have to navigate through my hard disk to find the folder even though I can see it right in front of me (and I have over 11000 folders, so it does get annoying).

The solution to this is easy: some sort of virtual folder at the top level of the file tree labelled “Open Folders …” that contains a list of the currently open folder windows in the Finder.  Indeed, for years I instinctively clicked on the ‘Desktop’ folder expecting it to contain the open windows, but of course this just refers to the various aliases and files permanently on the desktop background, not the open windows I can see in front of me.

In fact, as Mac OS X is built on top of UNIX, there is an easy, very UNIX-ish fix (or maybe hack): the Finder could simply maintain an actual folder (probably on the desktop) called “Finder Folders” and add aliases to folders as you navigate.  Although less in the spirit of Windows, this would certainly be possible there too, and of course on any of the Linux-based systems.  … so OS developers out there, “fix it” – it is easy.
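And the alias-maintaining part really is easy.  The sketch below is purely illustrative – the function name is my own, and on a real Mac the list of open windows would have to come from the Finder itself (e.g. via AppleScript); here it is simply passed in as a list of paths, with symbolic links standing in for Finder aliases:

```python
import os

def sync_finder_folders(open_paths, target_dir):
    """Maintain a folder of symlinks ('aliases') mirroring the currently
    open folder windows.  `open_paths` would in practice come from the
    window system; here it is just a list of directory paths."""
    os.makedirs(target_dir, exist_ok=True)
    wanted = {os.path.basename(p.rstrip("/")): p for p in open_paths}
    # remove links for windows that have been closed
    for name in os.listdir(target_dir):
        link = os.path.join(target_dir, name)
        if os.path.islink(link) and name not in wanted:
            os.remove(link)
    # add links for newly opened windows
    for name, path in wanted.items():
        link = os.path.join(target_dir, name)
        if not os.path.lexists(link):
            os.symlink(path, link)
```

Each call would be triggered as windows open and close, so the “Finder Folders” folder always shows what is on screen.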

So why is it that this is a persistent and annoying problem and has an easy fix, and yet is still there in every system I have used after 30 years of windowing systems?

First, it is annoying and persistent, but does not stop you getting things done; it is about efficiency, not a ‘bug’ … and system designers love to say, “but it can do X”, and then send flying fingers over the keyboard to show you just how.  So it gets overshadowed by bigger issues and never appears in bug lists – and even though it has annoyed me for years, no, I have never sent a bug report to Apple either.

Second, it is only a problem when you have sufficient files.  This means it is unlikely to be encountered during normal user testing.  There is a class of problems like this, and ‘expert slips’1, that require very long-term use before they become apparent.  Rigorous user testing is not sufficient to produce usable systems.  To be fair, many people have a relatively small number of files and folders (often just one enormous “My Documents” folder!), but at a time when PCs ship with hundreds of gigabytes of disk it does seem slightly odd that so much software fails either in terms of user interface (as in this case) or in terms of functionality (Spotlight is seriously challenged by my disk) when you actually use the space!

Finally, and I think the real reason, is in the implementation architecture.  For all sorts of good software engineering reasons, the functional separation between applications is very strong.  Typically the only way they ‘talk’ is through cut-and-paste or drag-and-drop, with occasional scripting for real experts. In most windowing environments the ‘application’ that lets you navigate files (Finder on the Mac, File Explorer in Windows) is just another application like all the rest.  From a system point of view, the file selection dialogue is part of the lower level toolkit and has no link to the particular application called ‘Finder’.  However, to me as a user, the Finder is special; it appears to me (and I am sure most) as ‘the computer’ and certainly part of the ‘desktop’.  Implementation architecture has a major interface effect.

But even if the Finder is ‘just another application’, the same holds for all applications.  As a user I see them all and if I have selected a font in one application why is it not easier to select the same font in another?  In the semantic web world there is an increasing move towards open data / linked data / web of data2, all about moving data out of application silos.  However, this usually refers to persistent data more like the file system of the PC … which actually is shared, at least physically, between applications; what is also needed is that some of the ephemeral state of interaction is also shared on a moment-to-moment basis.

Maybe this will emerge anyway with increasing numbers of micro-applications such as widgets … although if anything they often sit in silos as much as larger applications, just smaller silos.  In fact, I think the opposite is true, micro-applications and desktop mash-ups require us to understand better and develop just these ways to allow applications to ‘open up’, so that they can see what the user sees.

  1. see “Causing Trouble with Buttons” for how Steve Brewster and I once forced infrequent expert slips to happen often enough to be user testable[back]
  2. For example the Web of Data Practitioners Days I blogged about a couple of months back and the core vision of Talis Platform that I’m on the advisory board of.[back]

Dublin, Guinness and the future of HCI

I wrote the title of this post on 5th December just after I got back from Dublin.  I had been in Dublin for the SIGCHI Ireland Inaugural Lecture “Human–Computer Interaction in the early 21st century: a stable discipline, a nascent science, and the growth of the long tail” and was going to write a bit about it (including my first flight on and off Tiree) … then thought I’d write a short synopsis of the talk … so parked this post until the synopsis was written.

One month and 8000 words later – the ‘synopsis’ sort of grew … but just finished and now it is on the web as either HTML version or PDF. Basically a sort of ‘state of the nation’ about the current state and challenges for HCI as a discipline …

And although it now fades a little, I had a great time in Dublin: meeting people, talking research, good company, good food … and yes … the odd pint of Guinness too.

reading: Computing is a Natural Science

I’ve just been re-reading Peter Denning’s article “Computing is a Natural Science”1.  The basic thesis is that computation as a broad concept goes way beyond digital computers and that many aspects of science and life have a computational flavour; “Computing is the study of natural and artificial information processes”.  As an article this is in some ways not so controversial, as computational analogies have been used in cognitive science for almost as long as there have been computers (and before that, analogies with steam engines), and readers of popular science magazines can’t have missed the physicists using parallels with information and computation in cosmology.  However, Denning’s paper is to some extent a manifesto, or call to arms: “computing is not just a load of old transistors, but something to be proud of, the roots of understanding the universe”.

Particularly useful is the “principles framework for computing” that Denning has been constructing through a community discussion process.  I mentioned these before when blogging about “what is computing?”.  The article lists the top-level categories and gives examples of how each of these can be found in areas that one would not generally think of as ‘computing’.

Computation (meaning and limits of computation)
Communication (reliable data transmission)
Coordination (cooperation among networked entities)
Recollection (storage and retrieval of information)
Automation (meaning and limits of automation)
Evaluation (performance prediction and capacity planning)
Design (building reliable software systems)

categories from “Great Principles of Computing”

Occasionally the mappings are stretched … I’m not convinced that natural fractals are about hierarchical aggregation … but they do paint a picture of ubiquitous parallels across the key areas of computation.

Denning is presenting a brief manifesto not a treatise, so examples of very different kinds tend to be presented together.  There seem to be three main kinds:

  • ubiquity of computational devices – from iPods to the internet, computation is part of day-to-day life
  • ubiquity of computation as a tool –  from physical simulations to mathematical proofs and eScience
  • ubiquity of computation as an explanatory framework – modelling physical and biological systems as if they were performing a computational function

It is the last, computation as analogy or theoretical lens that seems most critical to the argument as the others are really about computers in the traditional sense and few would argue that computers (rather than computation) are everywhere.

Rather like weak AI, one can treat these analogies as simply that – rather like the fact that electrical flow in circuits can be compared with water flow in pipes.  So we may feel that computation is a good way to understand a phenomenon, but with no intention of saying the phenomenon is fundamentally computational.

However, some things are closer to a ‘strong’ view of computation as natural science.  One example of this is the “socio-organisational Church-Turing hypothesis”, a term I often use in talks with its origins in a 1998 paper “Redefining Organisational Memory“.  The idea is that organisations are, amongst other things, information processing systems, and therefore it is reasonable to expect to see structural similarities between phenomena in organizations and those in other information processing systems such as digital computers or our brains.  Here computation is not simply an analogy or useful model, the similarity is because there is a deep rooted relationship – an organisation is not just like a computer, it is actually computational.

Apparently absent are examples of where the methods of algorithmics or software engineering are being applied in other domains – what has become known as ‘computational thinking’.  This may be because there are two sides to computing:

  • computing (as a natural science) understanding how computers (and similar things) behave – related to issues such as emergence, interaction and communication, limits of computability
  • computing (as a design discipline) making computers (and similar things) behave as we want them to – related to issues such as hierarchical decomposition and separation of concerns

The first can be seen as about understanding complexity and the latter controlling it.  Denning is principally addressing the former, whereas computational thinking is about the latter.

Denning’s thesis could be summarised as “computation is about more than computers”.  However, that is perhaps obvious in that the early study of computation by Church and Turing was before the advent of digital computers; indeed Church was primarily addressing what could be computed by mathematicians not machines!  Certainly I was left wondering what exactly was ubiquitous: computation, mathematics, cybernetics?

Denning notes how the pursuit of cybernetics as an all-embracing science ran aground as it seemed to claim too much territory (although the practical application of control theory is still alive and well) … and one wonders if the same could happen for Denning’s vision.  Certainly embodied mind theorists would find more in common with cybernetics than much of computing, and mathematicians are not going to give up the language of God too easily.

Given my interest in all three subjects, computation, mathematics, cybernetics, it prompts me to look for the distinctions, and overlaps, and maybe the way that they are all linked (perhaps all just part of mathematics ;-)).  Furthermore as I am also interested in understanding the nature of computation I wonder whether looking at how natural things are ‘like’ computation may be not only an application of computational understanding, but more a lens or mirror that helps us see computation itself more clearly.

  1. well when I say re-reading I think the first ‘read’ was more of a skim, great thing about being on sabbatical is I actually DO get to read things 🙂 [back]

Steve’s bin

This is Steve‘s bin that I mentioned in my last post.

Glasdon UK: Plaza® Litter Bin


It had to be drunk proof, dustman proof, and bomb proof.  It also had to be emptied without needing a key, yet be difficult to open if you don’t know how (to prevent Saturday night vandalism).  To top it all, it had to be designed so that it could be replaced after emptying and self-lock, and yet be made by a moulding process that means there may be up to a couple of centimetres movement from the design spec.  I am very impressed.

strength in weakness – Judo design

Steve Gill is visiting so that we can work together on a new book on physicality.  Last night, over dinner, Steve was telling us about a litter-bin lock that he once designed.  The full story linked creative design, the structural qualities of materials, and the social setting in which it was placed … a story well worth hearing, but I’ll leave that to Steve.

One of the critical things about the design was that while earlier designs used steel, his design needed to be made out of plastic.  Steel is an obvious material for a lock: strong and unyielding; however, the plastic lock worked because the lock and the bin around it were designed to yield, to give a little, and in so doing to absorb the shock if kicked by a drunken passer-by.

This is a sort of Judo principle of design: rather than trying to be the strongest or toughest, instead, by yielding in the right way, you use the strength of your opponent.

This reminded me of trees that bend in the wind and stand the toughest storms (the wind howling down the chimney maybe helps the image), whereas those that are stiffer may break.  Also old wooden pit-props that would moan and screech when they grew weak and gave slightly under the strain of rock; whereas the stronger steel replacements would stand firm and unbending until the day they catastrophically broke.

Years ago I also read about a programme to strengthen bridges as lorries got heavier.  The old arch bridges had an infill of loose rubble, so the engineers simply replaced this with concrete.  In a short time the bridges began to fall down.  When analysed more deeply, the reason became clear.  When an area of the loose infill loses strength, it gives a little, so the strain on it is relieved and the areas around take the strain instead.  However, the concrete is unyielding, and instead the weakest point takes more and more strain until eventually cracks form and the bridge collapses.  Twisted ropes work on the same principle.  Although now an old book, “The New Science of Strong Materials” opened my eyes to the wonderful way many natural materials, such as bone, make use of the relative strengths, and weaknesses, of their constituents, and how this is emulated in many composite materials such as glass fibre or carbon fibre.
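The contrast can even be captured in a toy model (the numbers are purely illustrative – this is a crude version of what materials scientists call a fibre bundle model, not an engineering calculation).  Elements of a structure share a load; in the brittle version any element pushed past its strength snaps and its share passes to the survivors, so the weakest links can trigger a cascade; in the yielding version an element that reaches its limit just gives a little and sheds the excess:

```python
def brittle_capacity(strengths):
    # elements share the load equally; when one snaps the survivors
    # take its share, so failure of the weakest can cascade
    s = sorted(strengths)
    n = len(s)
    # with the weakest k elements already failed, the survivors carry
    # load/(n-k) each, limited by the strength of the weakest survivor
    return max(s[k] * (n - k) for k in range(n))

def yielding_capacity(strengths):
    # each element gives a little once it reaches its own limit,
    # shedding the excess to its neighbours, so the structure only
    # fails when the *total* strength is exceeded
    return sum(strengths)

strengths = [1, 2, 3, 4, 5]
print(brittle_capacity(strengths))   # 9  -- the weak links drag it down
print(yielding_capacity(strengths))  # 15 -- every element contributes fully
```

The rubble infill and the twisted rope behave like the second function; the concrete behaves like the first.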

In contrast both software and bureaucratic procedures are more like chains – if any link breaks the whole thing fails.

Steve’s lock design shows that it is possible to use the principle of strength in weakness with modern materials, not only in organic elements like wood, or traditional bridge design.  For software also, one of the things I often try to teach is to design for failure – to make sure things work when they go wrong.  In particular, for intelligent user interfaces, there is the idea of appropriate intelligence – making sure that when intelligent algorithms get things wrong, the user experience does not suffer.  It is easy to want to design the cleverest algorithms, the most complex systems – to design for everything, to make it all perfect.  While it is of course right to seek the best, often it is the knowledge that what we produce will not be ‘perfect’ that in fact enables us to make it better.
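A sketch of the pattern in Python (the names and the confidence threshold are my own invention – the point is only the shape: the clever part is allowed to fail, and the fallback is dull but dependable):

```python
def suggest(history, clever_guess, threshold=0.8):
    """Appropriate intelligence as a fallback pattern: use the clever
    algorithm's suggestion only when it is confident; otherwise fall
    back to a simple, predictable default (most recently used), so a
    wrong or crashing algorithm never makes the interface worse than
    having no intelligence at all."""
    try:
        guess, confidence = clever_guess(history)
        if confidence >= threshold:
            return guess
    except Exception:
        # a crash in the clever code is treated like an unconfident guess
        pass
    return history[-1] if history else None

# e.g. a 'clever' predictor that is sure, unsure, or broken:
print(suggest(["a", "b"], lambda h: ("c", 0.95)))  # c  (trusted)
print(suggest(["a", "b"], lambda h: ("c", 0.40)))  # b  (fallback)
print(suggest(["a", "b"], lambda h: 1 / 0))        # b  (fallback)
```

The structure yields, Judo-like, instead of snapping: the user always gets something sensible.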

The Cult of Ignorance

Throughout society, media, and academia, it seems that ignorance is no longer a void to be filled, but a virtue to be lauded.  Ignorance is certainly not a ‘problem’, not something to be ashamed of, but is either an opportunity to learn or a signal that you need to seek external expertise.  However, when ignorance is seen as something not just good in itself, but almost a sign of superiority over those who do have knowledge or expertise, then surely this is a sign of a world in decadence.

Although it is something of which I’ve been aware for a long time, two things prompt me to think again about this: a mailing list discussion about science in schools and a recent paper review.

The CPHC mailing list discussion was prompted by a report by the BBC on a recent EU survey on attitudes to science amongst 15-25 year olds.  The survey found that around 1/2 of Irish and British respondents felt they “lacked the skills to pursue a career in science” compared with only 10% in several eastern European countries.  The discussion was prompted not so much by the result itself but by the official government response that the UK science community needed to do more “to understand what excites and enthuses young people and will switch them on to a science future.”  While no-one disagrees with the sentiment, regarding it as ‘the problem’ disregards the fact that those countries where scientific and mathematical education is not a problem are precisely those where the educational systems are more traditional, less focused on motivation and fun!

I have blogged before about my concerns regarding basic numeracy, but that was about ‘honest ignorance’, people who should know not knowing.  However, there is a common attitude to technical subjects that makes it a matter of pride for otherwise educated people to say “I could never do maths” or “I was never good at science”, in a way that would be incongruous if it were said about reading or writing (although as we shall see below technologists do do precisely that), and often with the implication that to have been otherwise would have been somehow ‘nerdy’ and made them less well-balanced people.

Sadly this cult of ignorance extends also to academia.

A colleague of mine recently had reviews back on a paper.  One reviewer criticised the use of the term ‘capitalisation’ (which in context referred to ‘social capital’), as to the reviewer the word meant making letters upper case.  The reviewer suggested that this might be a word in the author’s native language.

At a time when the recapitalisation of banks is a major global issue, this surely feels like culpable ignorance.  Obviously the word was being used in a technical sense, but the reviewer was suggesting it was not standard English.  Of course, ‘capital’ in the financial sense certainly dates back 300 years, the verb ‘capitalise’ is part of everyday speech (“let’s capitalise on our success”), and my 30-year-old Oxford English Dictionary includes the following:

Capitalize 1850. …. 2. The project of capitalizing incomes 1856. Hence Capitalization.

Now I should emphasise it is not the ignorance of the reviewer I object to; I know I am ignorant of many things and ready to admit it.  The problem is that the reviewer feels confident enough in that ignorance to criticise the author for the use of the word … apparently without either (a) consulting a dictionary, or (b) while filling out the online review form bothering to Google it!

This reminded me of a review of a paper I once received that criticised my statistical language, suggesting I should use the proper statistical term ‘significance’ rather than the informal language ‘confidence’.  Now many people do not really understand the difference between significance testing (evidence of whether things are different) and confidence intervals (evidence of how different or how similar they are) – and so rarely use the latter, even though confidence intervals are a more powerful statistical tool.  However the problem here is not so much the ignorance of the reviewer (albeit that a basic awareness of statistical vocabulary would seem reasonable in a discipline with a substantial experimental side), but the fact that the reviewer felt confident enough in his/her ignorance to criticise without either consulting an elementary statistical text book or Googling “statistics confidence”.
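For the curious, the distinction fits in a few lines of Python (the task times are made up, and I use the normal-approximation multiplier 1.96 for simplicity – with a sample this small a t-value would strictly be more accurate).  A significance test only answers “does the mean differ from some null value?”; the interval answers “how big is it plausibly?”:

```python
import math
import statistics

def mean_ci(sample, z=1.96):
    """95% confidence interval for the mean (normal approximation)."""
    m = statistics.mean(sample)
    se = statistics.stdev(sample) / math.sqrt(len(sample))
    return (m - z * se, m + z * se)

# made-up task completion times (seconds) for some interface
times = [12.1, 11.4, 13.2, 12.8, 11.9, 12.5, 13.0, 12.2]
low, high = mean_ci(times)
# the interval says the true mean plausibly lies between low and high,
# not merely that it differs 'significantly' from a null hypothesis
print(round(low, 2), round(high, 2))
```

So confidence intervals carry strictly more information: if the interval excludes the null value, the difference is also significant at that level.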

So, let’s be proud of our skills and our knowledge, humble in accepting the limits of what we know, and confident enough in ourselves that we do not need to denigrate others for doing what we cannot.  Then ignorance becomes a springboard to learn more and a launching point for collaboration.

web ephemera and web privacy

Yesterday I was twittering about a web page I’d visited on the BBC1 and the tweet also became my Facebook status2.  Yanni commented on it, not because of the content of the link, but because he noticed the ‘is.gd’ url was very compact.  Thinking about this has some interesting implications for privacy/security and the kinds of things you might want to use different url shortening schemes for, but also led me to develop an interesting time-wasting application ‘LuckyDip‘ (well, if ‘develop’ is the right word, as it was just 20–30 mins hacking!).

I used the ‘is.gd’ shortening because it was one of three schemes offered by twirl, the twitter client I use.  I hadn’t actually noticed that it was significantly shorter than the others or indeed tinyurl, which is what I might have thought of using without twirl’s interface.

Here is the url of this blog <http://www.alandix.com/blog/> shortened by is.gd and three other services:

snurl:   http://snurl.com/5ot5k
twurl:  http://twurl.nl/ftgrwl
tinyurl:  http://tinyurl.com/5j98ao
is.gd:  http://is.gd/7OtF

The is.gd link is small for two reasons:

  1. ‘is.gd’ is about as short as you can get with a domain name!
  2. the ‘key’ bit after the domain is only four characters as opposed to 5 (snurl) or 6 (twurl, tinyurl)

The former is just clever domain choice, hard to get something short at all, let alone short and meaningful3.

The latter however is as a result of a design choice at is.gd.  The is.gd urls are allocated sequentially, the ‘key’ bit (7OtF) is simply an encoding of the sequence number that was allocated.  In contrast tinyurl seems to do some sort of hash either of the address or maybe of a sequence number.
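Sequential allocation is easy to sketch: the key is just the sequence number written in base 62, one ‘digit’ per letter or numeral.  The alphabet and digit order below are my guesses – is.gd’s actual scheme may differ – but the arithmetic is the point: four such characters give 62⁴ = 14,776,336 possible keys.

```python
ALPHABET = "0123456789abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ"

def encode(n):
    """Write sequence number n in base 62 to get the short 'key'."""
    key = ""
    while True:
        n, r = divmod(n, 62)
        key = ALPHABET[r] + key
        if n == 0:
            return key

def decode(key):
    """Inverse: recover the sequence number from a key."""
    n = 0
    for c in key:
        n = n * 62 + ALPHABET.index(c)
    return n

# every number up to 62**4 - 1 fits in at most four characters
assert len(encode(62**4 - 1)) == 4
assert decode(encode(1000000)) == 1000000
```

So a four-character key like ‘7OtF’ simply means “the N-millionth-or-so url ever shortened”.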

The side effect of this is that if you simply type in a random key (below the last allocated sequence number) for an is.gd url, it will be a valid url.  In contrast, the space of tinyurl is bigger, so ‘in principle’ only about one in a hundred keys will represent real pages … now I say ‘in principle’ because, experimenting with tinyurl, I find every six-character sequence I type as a key gets me to a valid page … so maybe they do some sort of ‘closest’ match.

Whatever url shortening scheme you use, by its nature the shortened url will be less redundant than a full url – more ‘random’ permutations will represent meaningful items.  This is a natural result of any ‘language’: the more concise you are, the less redundant the language.

At a practical level this means that if you use a shortened url, it is more likely that someone  typing in a random is.gd (or tinyurl) key will come across your page than if they just type a random url.  Occasionally I upload large files I want to share to semi-private urls, ones that are publicly available, but not linked from anywhere.  Because they are not linked they cannot be found through search engines and because urls are long it would be highly unlikely that someone typing randomly (or mistyping) would find them.

If however, I use url shortening to tell someone about it, suddenly my semi-private url becomes a little less private!

Now of course this only matters if people are randomly typing in urls … and why would they do such a thing?

Well a random url on the web is not very interesting in general, there are 100s of millions and most turn out to be poor product or hotel listing sites.  However, people are only likely to share interesting urls … so random choices of shortened urls are actually a lot more interesting than random web pages.

So, just for Yanni, I spent a quick 1/2 hour4 and made a web page/app ‘LuckyDip‘.  This randomly chooses a new page from is.gd every 20 seconds – try it!
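The heart of LuckyDip is then just a random number below the last allocated sequence number, re-encoded as a key (again, the alphabet and the way `last_allocated` would be obtained are assumptions of mine – the real page simply reloads a frame with a fresh random url every 20 seconds):

```python
import random

ALPHABET = "0123456789abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ"

def lucky_dip_url(last_allocated):
    """Pick a random is.gd-style short url.  Because keys are allocated
    sequentially, any number below `last_allocated` should be a live
    page -- which is exactly what makes the lucky dip work."""
    n = random.randrange(last_allocated)
    key = ""
    while True:
        n, r = divmod(n, 62)
        key = ALPHABET[r] + key
        if n == 0:
            return "http://is.gd/" + key

print(lucky_dip_url(62**4))  # a different random short url each run
```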


successive pages from LuckyDip

Some of the pages are in languages I can’t read, occasionally you get a broken link, and the ones that are readable, are … well … random … but oddly compelling.  They are not the permanently interesting pages you choose to bookmark for later, but the odd page you want to send to someone … often trivia, news items, even (given is.gd is in a twitter client) the odd tweet page on the twitter site.  These are not like the top 20 sites ever, but the ephemera of the web – things that someone at some point thought worth sharing, like overhearing the odd raised voice during a conversation in a train carriage.

Some of the pages shown are map pages, including ones with addresses on … it feels odd, voyeuristic, web curtain twitching – except you don’t know the person, the reason for the address; so maybe more like sitting watching people go by in a crowded town centre, a child cries, lovers kiss, someone’s newspaper blows away in the wind … random moments from unknown lives.

In fact most things we regard as private are not private from everyone.  It is easy to see privacy like an onion skin: the inner sanctum, then those further away, and then complete strangers – the further away someone is from ‘the secret’, the more private something is.  This is certainly the classic model in military security.  However, think further and there are many things you would be perfectly happy for a complete stranger to know, but maybe not those a little closer: your work colleagues, your commercial competitors.  The onion sort of reverses: apart from those that you explicitly want to know, in fact the further out of the onion, the safer it is.  Of course this can go wrong sometimes, as Peter Mandelson found out chatting to a stranger in a taverna (see BBC blog).

So I think LuckyDip is not too great a threat to the web’s privacy … but do watch out what you share with short urls … maybe the world needs a url lengthening service too …

And as a postscript … last night I was trying out the different shortening schemes available from twirl, and accidentally hit return, which created a tweet with the ‘test’ short url in it.  Happily you can delete tweets, and so I thought I had eradicated the blunder unless any twitter followers happened to be watching at that exact moment … but I forgot that my twitter feed also goes to my Facebook status and that deleting the tweet on twitter did not remove the status, so overnight the slip was my Facebook status and at least one person noticed.

On the web nothing stays secret for long, and if anything is out there, it is there for ever … and will come back to haunt you someday.

  1. This is the tweet “Just saw http://is.gd/7Irv Sad state of the world is that it took me several paragraphs before I realised it was a joke.”[back]
  2. I managed to link them up some time ago, but cannot find again the link on twitter that enabled this, so would be stuck if I wanted to stop it![back]
  3. anyone out there registering Bangaldeshi domains … if ‘is’ is available!![back]
  4. yea it should have been less, but I had to look up how to access frames in javascript, etc.[back]

Coast to coast: St Andrews to Tiree

A week ago I was in St Andrews on the east coast of Scotland delivering three lectures on “Human Computer Interaction: as it was, as it is and as it may be” as part of their distinguished lecture series and now I am in Tiree in the wild western ocean off the west coast.

I had a great time in St Andrews and was well looked after by some I knew already – Ian, Gordan, John and Russell – and also met many new people.  I ate good food and stayed in a lovely hotel overlooking the sea (and golf course) and full of pictures of golfers (well, what do you expect in St Andrews).

For the lectures, I was told the general pattern was one lecture about the general academic area, one ‘state of the art’ and one about my own stuff … hence the three parts of the title!  Ever one for cutesy titles, I then called the individual lectures “Whose Computer Is It Anyway”, “The Great Escape” and “Connected, but Under Control, Big, but Brainy?”.

The first lecture was about the fact that computers are always ultimately for people (surprise surprise!) and I used Ian’s slight car accident on the evening before the lecture as a running example (sorry Ian).

The second lecture was about the way computers have escaped the office desktop and found their way into the physical world of ubiquitous computing, the digital world of the web, and into our everyday lives in our homes – and increasingly the hub of our social lives too.  Matt Oppenheim did some great cartoons for this and I’m going to use them again in a few weeks when I visit Dublin to do the inaugural lecture for SIGCHI Ireland.

for 20 years the computer is chained to the office desktop (image © Matt Oppenheim)

(© Matt Oppenheim)

... now escapes: out into the world, spreading across the net, in the home, in our social lives (image © Matt Oppenheim)

(© Matt Oppenheim)

The last lecture was about intelligent internet stuff, similar to the lecture I gave at Aveiro a couple of weeks back … mentioning again the fact that the web now has the same information storage and processing capacity as a human brain1 … always makes people think … well at least it always makes ME think about what it means to be human.

… and now … in Tiree … sun, wild wind, horizontal hail, and paddling in the (rather chilly) sea at dawn

  1. see the brain and the web[back]

From Parties in Aveiro to Packing A Van in a week

Last Friday I was in Aveiro giving a keynote at ENEI, the national congress of Portuguese informatics students.  The event was organised for and by the students themselves and I was looked after wonderfully.  João was especially great, picking me up from Lisbon airport at midnight, driving me to Aveiro the next day and then on Saturday driving me to Porto airport after less than 2 hours sleep … but more on that later …

The congress itself was in Portuguese except my talk, and I was only able to spend one day there as I needed to get back to pack; so, by the time I had met the press and talked to different people, the day flew by … in time for an evening of typical Portuguese culture of different kinds.

First dinner of roast suckling pig – prepared in the town near Aveiro where this is the traditional dish.  Those who know me know that despite all appearances to the contrary: sandals, long hair, beard; I am NOT a vegetarian 🙂  The meat itself reminded me of the rich flavour of belly pork at Sunday dinners when I was a small child; although much more delicate and without the tooth breaking thickness of the older meat.

After traditional cuisine I was given a taste of traditional student life.  This was a weekend when first-year freshman students all over Portugal have parties … for several nights in a row.  The student organisers I was with had been up to 6am the previous night as well as organising the conference during the day.  And this night, after having eaten suckling pig at 10pm, they drove me down to a location in the docklands of Aveiro, far from any homes that would be disturbed by the noise, to a tiny village of food stalls, a huge music tent … and bars run by every student club in the university.

Some of the students wore the traditional academic dress of Portugal – a thick black felt cloak hanging nearly to the floor, and hats – different for each university.  A group of visiting students from Braga wore three-cornered hats and looked every bit like a troupe of Dick Turpins.  The cloaks are torn around the bottom, where family and friends tear a gash in the edge … social networking before Facebook.  The middle of the back of the cloak is reserved for the girlfriend or boyfriend to tear … but if a relationship ended you had to sew up the tear ready for the next one!

I talked with a group of students from Évora who explained their tradition of peer tutelage (I have forgotten the name of the practice).  Two older students take a new student under their wing and teach him or her the practices of the University.  The young student did not have his cloak yet, as only after six weeks would he become a true freshman and be entitled to the cloak.  In the meantime they would carry him home if the parties proved too much and also put him right if he did things wrong … I missed the details of this, but I’m sure it included eggs (!?).  The young student was enjoying the process and the older pair clearly took their responsibilities very seriously.  At the end of the evening they asked me to tear their cloaks, which involved using my teeth to start the tear.  I was very honoured to be asked.  The idea of biting cloth that had been scraping the ground is not something I would normally relish (!), but given the evening consisted largely of groups of students, who had been at the afternoon lecture, pressing shots of various drinks upon me, by 4am a little dirt didn’t seem to matter so much 🙂

And this Friday, no shots of liqueur in chocolate cups, or suckling pig and Portuguese wine, but instead packing a Luton van ready to move up to Tiree for my sabbatical.  Kiel was an absolute star, coming first thing in the morning, lugging filing cabinets and freezers … and boxes some way beyond the current 25kg one-man lift limit.  I recall routinely carrying hundredweight sacks when I was younger, and Kiel spent his youth lugging rolls of cloth around a textile factory, so for both of us 25kg seemed a little wimpish … however, there is a vast difference between a 20kg box and a 30kg one … so the health and safety people probably have it right … and we did try to keep them all below 25kg, but that is very hard with boxes of books … and there are many of them – a ton weight of books in fact, not to mention another half ton of bookshelving.

So this morning finds me half way up the M6, tomorrow morning we’ll be on the ferry, and the next morning in  Tiree, ready for a year of hermit-like writing and working … not to mention the odd walk on the near empty two-mile beach outside the door … but that will be another story.

web of data practitioners days

I am at the Web of Data Practitioners Days (WOD-PD 2008) in Vienna – a mixture of talks and guided hands-on sessions.  I presented the first half of a session on “Using the Web of Data” this morning, with a focus (surprise) on the end user.  Learnt loads about some of the applications out there – in fact Richard Cyganiak .  Interesting talk from a guy at the BBC about the way they are using RDF to link the currently disconnected parts of their web and also archives.  Jana Herwig from the Semantic Web Company has been live blogging the event.

Being here has made me think about the different elements of SemWeb technology and how they individually contribute to the ‘vision’ of Linked Data.  The aim is to be able to link different data sources together.  For this having some form of shared/public vocabulary or ‘data definitions’ is essential as is some relatively uniform way of accessing data.  However, the implementation using RDF or use of SPARQL etc. seems to be secondary and useful for some data, but not other forms of data where tabular data may be more appropriate.  Linking these different representations  together seems far more important than specific internal representations.  So wondering whether there is a route to linked data that allows a more flexible interaction with existing data and applications as well as ‘sucking’ in this data into the SemWeb.  Can the vocabularies generated for SemWeb be used as meta information for other forms of information and can  query/access protocols be designed that leverage this, but include broader range of data types.