CSS considered harmful (the curse of floats and other scary stories)

CSS- and JavaScript-based sites have undoubtedly enabled experiences far richer than the grey-backgrounded days of the early web in the 1990s (and recall, the backgrounds really were grey!). However, the power and flexibility of CSS, in particular the use of floats, has led to a whole new set of usability problems on what appear to be beautifully designed sites.

I was reading a quite disturbing article on a misogynistic Dell event by Sophie Catherina Løhr at elektronista.dk.  However, reading it was frustrating, as a line of media icons on the left of the page meant that only the top few lines of text were unobstructed.

I was clearly not the only one with this problem as one of the comments on the page read:

That social media widget on the left made me stop reading an otherwise interesting article. Very irritating.

To be fair to the page designer, it was just on Firefox that the page rendered like this; on other browsers the left-hand page margin was wider.  Firefox is probably strictly ‘right’ in the sense that it sticks very close to the standards, but whoever is to blame, it is not helpful to readers of the blog.

For those wishing to write cross-browser styles, it is usually possible nowadays, but you often have to reset everything at the beginning of your style files, because even though CSS is a standard, browsers’ default styles are not:

body {
    margin: 0;
    padding: 0;
    /*  etc. */
}

Sadly this is just one example of an increasingly common problem.

A short while ago I was on a site that had a large right-hand side tab.  I forget its function; maybe comments, or a table of contents.  The problem was that the tab obscured and prevented access to most of the scroll bar, making navigation of the middle portion of the page virtually impossible.  Normally it is not possible to obscure the scroll bar as it is ‘outside’ the page. However this site, like many, had chosen to put the main content of the site in a fixed-size scrolling <div>.  This meant that the header and footer were always visible, and the content scrolled in the middle.  Of course, the scroll bar of the <div> is then part of the page and can be obscured.  I assume it was another cross-browser formatting difference that meant the designer did not notice the problem, or perhaps (not unlikely) they only ever tested the style with pages containing small amounts of non-scrolling text.

Some sites adopt a different strategy for providing fixed headers.  Rather than putting the main content in a fixed <div>, the header and footer are set to float above the main content, with margins added to the content so that the page renders correctly at top and bottom.  This means that the scroll bar for the content is the main scroll bar, and therefore cannot be hidden or otherwise mangled 🙂

Unfortunately, the browser’s page-search function does not ‘know’ about these floating elements, so when you type in a search term it will happily scroll the page to ‘reveal’ the searched-for word, but may do so in a way that leaves it underneath the header or footer, and so invisible.

This is not made easier to deal with in the new Mac OS X Lion, where the line up/down scroll arrows have been removed.  Not only can you not fine-adjust the page to reveal those hidden searched-for terms, but also, whilst reading the page, the page-up/page-down scroll does not ‘know’ about the hidden parts and so scrolls a full screen-sized page, missing half the text 🙁

Visibility problems are not confined to the web: there is a long history of modal dialogue boxes being lost behind other windows (which then often refuse to interact because of the modal dialogue box), windows happily resizing themselves to be partly obscured by the Apple Dock, or even disappearing onto non-existent secondary displays.

It may be that some better model of visibility could be built into both CSS/DOM/JavaScript and desktop window managers.  And it may even be that CSS will fix its slightly odd model of floats and layout.  However, I would not want to discourage the use of overlays, transparencies, and other floating elements until this happens.

In the mean time, some thoughts:

  1. restraint — Recall the early days of DTP, when every newsletter sported 20 fonts. No self-respecting designer would do this nowadays, so use floats, lightboxes and the like with consideration … and if you must have popups or tabs that open on hover rather than on click, do make sure it is possible to move your mouse across the page without it feeling like walking through a minefield.
  2. resizing — Do check your page with different window sizes. Although desktop screens are now almost all at least 1024×768, think of laptops and pads, as these are increasingly the major form of access.
  3. defaults — Be aware that, W3C notwithstanding, browsers are different.  At the very minimum reset all the margins and padding as a first step, so that you are not relying on browser defaults.
  4. testing — Do test (and here I mean technical testing; do user testing as well!) with realistic pages, not just a paragraph of lorem ipsum.

And do my sites do this well … ?

With CSS as in all things, with great power …

P.S. Computer scientists will recognise the pun on Dijkstra’s “go to statement considered harmful”, the manifesto of structured programming.  The use of gotos in early programming languages was incredibly flexible and powerful, but, just like CSS, with many concomitant potential dangers for the careless or unwary.  Strangely, computer scientists have had little worry about other equally powerful yet dangerous techniques, not least macro languages (anyone for a spot of TeX debugging?), and Scheme programmers throw around continuations as if they were tennis balls.  It seemed as though the humble goto became the scapegoat for a discipline’s sins. It was interesting when the goto statement was introduced as a ‘new’ feature in PHP 5.3, an otherwise post-goto C-style language; very retro.


[image: xkcd.com]

The value of networks: mining and building

The value of networks or graphs underlies many of the internet (and for that read global corporate) giants.  Two of the biggest, Google and Facebook, harness this in very different ways: mining and building.

Years ago, when I was part of the dot.com startup aQtive, we found there was no effective understanding of internet marketing, and so had to create our own.  Part of this we called ‘market ecology‘.  This basically involved mapping out the relationships of influence between different kinds of people within some domain, and then designing families of products that exploited that structure.

The networks we were looking at were about human relationships: for example teachers who teach children, who have other children as friends and siblings, and who go home to parents.  Effectively we were into (too) early social networking1!

The first element of this was about mining — exploiting the existing network of relationships.

However in our early white papers on the topic, we also noted that the power of internet products was that it was also possible to create new relationships, for example, adding ‘share’ links.  That is building the graph.

The two are not distinct: if a product is not able to exploit new relationships it will die, and the mining of existing networks can establish new links (e.g. Twitter suggesting whom to follow).  Furthermore, the creation of links is rarely ex nihilo; an email ‘share’ link uses an existing relationship (a contact in the address book), but brings it into a potentially different domain (e.g. bookmarking a web page).

It is interesting to see Google and Facebook against this backdrop.  Their core strengths are in different domains (web information and social relationships), but moreover they focus differently on mining and building.

Google is, par excellence, about mining graphs (the web).  While it has been augmented and modified over the years, the link structure used in PageRank is what made Google great.  Google also mines tacit relationships, for example using word collocation to understand concepts and relationships, so in a sense builds from what it mines.
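As a toy illustration, the link-structure idea can be sketched as a simple power iteration.  (This is a minimal sketch only, not Google’s actual algorithm, which copes with dangling pages, spam and web scale; the three-page `toy_web` graph and the function names are invented for the example.)

```python
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to.
    Returns an approximate PageRank score for every page."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # every page starts with the 'random surfer' base share...
        new_rank = {p: (1 - damping) / n for p in pages}
        # ...then each page passes its rank along its outgoing links
        for page, outlinks in links.items():
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank
    return rank

# a made-up three-page web: 'c' is linked to by both 'a' and 'b'
toy_web = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
ranks = pagerank(toy_web)
print(max(ranks, key=ranks.get))  # 'c' ends up most important
```

The point is that importance emerges purely from the mined link structure: nobody tells the algorithm that ‘c’ matters.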

Facebook’s power, in contrast, lies in the way it is building the social graph as hundreds of millions of people tell it about their own social relationships.  As noted, this is not ex nihilo; the social relationships exist in the real world, but Facebook captures them digitally.  Of course, Facebook then mines this graph in order to derive revenue from advertisements and (although people debate this) to attempt to improve the user experience by ranking posts.

Perhaps the greatest power comes in marrying the two.   Amazon does this to great effect within the world of books and products.

As well as a long-standing academic interest, these issues are particularly germane to my research at Talis where the Education Graph is a core element.  However, they apply equally whether the core network is kite surfers, chess or bio-technology.

Between the two, it is probably building that is ultimately most critical.  When one has a graph or network it is possible to find ways to exploit it, but without the network there is nothing to mine. Page and Brin knew this in the early days of their pre-Google project at Stanford, and a major effort was focused on simply gathering the crawl of the web on which they built their algorithms2.  Now Google is aware that, in principle, others can exploit the open resources on which much of its business depends; its strength lies in its intellectual capital. In contrast, with a few geographical exceptions, Facebook is the social graph, which is far more defensible, as Google has discovered as it struggles with Google Plus.

  1. See our retrospective about vfridge  at  last year’s HCI conference and our original web sharer vision.[back]
  2. See the description of this in “In The Plex: How Google Thinks, Works and Shapes Our Lives“.[back]

books: The Nature of Technology (Arthur) and The Evolution of Technology (Basalla)

I have just finished reading “The Nature of Technology” (NoT) by W. Brian Arthur and some time ago read “The Evolution of Technology” (EoT) by George Basalla, both covering a similar topic, the way technology has developed from the earliest technology (stone axes and the wheel), to current digital technology.  Indeed, I’m sure Arthur would have liked to call his book “The Evolution of Technology” if Basalla had not already taken that title!

We all live in a world dominated by technology, and so the issue of how technology develops is critical to us all.   Does technology ultimately serve human needs, or does it have its own dynamics independent of us except maybe as cogs in its wheels?  Is the arc of technology inevitable, or do human creativity and invention drive it in new directions? Is the development of technology now similar to that of previous generations (albeit a bit faster), or does digital technology fundamentally alter things?

Basalla was published in 1988 and Arthur in 2009, so Arthur had 20 more years to work with; not much compared to 2 million years for the stone axe and 5000 years for the wheel, but 20 years that included the dot.com boom (and bust!) and the growth of the internet.  In a footnote (NoT,p.17), Arthur describes Basalla as “the most complete theory to date”, although he does not appear to reference Basalla directly again in the text, maybe because they have different styles: Basalla (a historian of technology) offers a more descriptive narrative whilst Arthur (an engineer and economist) seeks a more analytically complete account. However, I also suspect that Arthur discovered Basalla’s work late and included a ‘token’ reference; he says that a “theory of technology — an “ology” of technology” is missing (NoT,p.14), but, however partial, Basalla’s account cannot be seen as other than part of such a theory.

Both authors draw heavily, both explicitly and implicitly, on Darwinian analogies, but both also emphasise the differences between biological and technological evolution. Neither is happy with what Basalla calls the “heroic theory of invention”, where “inventions emerge in a fully developed state from the minds of gifted inventors” (EoT,p.20).  In both there are numerous case studies which refute these more ‘heroic’ accounts, for example Watt’s invention of the steam engine after seeing a kettle lid rattling on the fire, and which show how inventions are always built on earlier technologies and knowledge.  Arthur is more complete in eschewing explanations that depend on human ingenuity, and therein, to my mind, lies the weakness of his account.  However, Arthur does take into account, as a central mechanism, the accretion of technological complexity through the assembly of components, which is all but absent from Basalla’s account — indeed, in my notes as I read Basalla I wrote “B is focused on components in isolation, forgets implication of combinations”.

I’ll describe the main arguments of each book, then look at what a more complete picture might look like.

(Note, very long post!)

Basalla: the evolution of technology

Basalla describes his theory of technological evolution in terms of four concepts:

  1. diversity of artefacts — acknowledging the wide variety both of different kinds of things and of variations of the same thing; one example, dear to my heart, is his images of different kinds of hammers 🙂
  2. continuity of development — new artefacts are based on existing artefacts with small variations, there is rarely sudden change
  3. novelty — introduced by people and influenced by a wide variety of psychological, social and economic factors … not least playfulness!
  4. selection — winnowing out the less useful/efficient artefacts, and again influenced by a wide variety of human and technological factors

Basalla sets himself apart both from earlier historians of technology (Gilfillan and Ogburn), who took an entirely continuous view of development, and from the “myths of the heroic inventors”, which saw technological change as dominated by discontinuous change.

He is a historian, and his accounts of the development of artefacts are detailed and beautifully crafted.  He takes great efforts to show how standard stories of heroic invention, such as the steam engine, can be seen much more sensibly in terms of slower evolution.  In the case of steam, the basic principles had given rise to Newcomen’s steam pump some 60 years prior to Watt’s first steam engine.  However, whilst each of these stories emphasised the role of continuity, as I read them I was also struck by the role of human ingenuity.  If Newcomen’s engine had been around since 1712, what made the development of a new and far more successful form take 60 years? The answer is surely the ingenuity of James Watt.  Newton said he saw further only because he stood on the shoulders of giants, and yet is no less a genius for that.  Similarly, the tales of invention seem to be ones of continuity, but also often enabled by insights.

In fact, Basalla does take this human role on board, building on Usher’s earlier work, which placed insight centrally in accounts of continuous change.  This is particularly central in his account of the origins of novelty, where he considers a rich set of factors that influence the creation of true novelty.  These include both individual factors such as playfulness and fantasy, and social/cultural factors such as migration and the patent system.  It is interesting, however, that when he turns to selection, it is lumpen factors that are dominant: economic, military, social and cultural.  This brings to mind Margaret Boden’s H-creativity and also Csikszentmihalyi’s cultural views of creativity — basically something is only truly creative (or maybe innovative) when it is recognised as such by society (discuss!).

Arthur: the nature of technology

Basalla ends his book confessing that he is not happy with the account of novelty as provided from historical, psychological and social perspectives.  Arthur’s single reference to Basalla (endnote, NoT, p.17) picks up precisely this gap, quoting Basalla’s “inability to account fully for the emergence of novel artefacts” (EoT,p.210).  Arthur seeks to fill this gap in previous work by focusing on the way artefacts are made of components, novelty arising through the hierarchical organisation and reorganisation of these components, ultimately built upon natural phenomena.  In language reminiscent of proponents of ‘computational thinking’, Arthur talks of a technology being the “programming of phenomena for our purposes” (NoT,p.51). Although not directly on this point, I particularly liked Arthur’s quotation from Charles Babbage, “I wish to God this calculation had been executed by steam” (NoT,p.74), but did wonder whether Arthur’s computational analogy for technology was as constrained by the current digital perspective as Babbage’s was by the age of steam.

Although I’m not entirely convinced at the completeness of hierarchical composition as an explanation, it is certainly a powerful mechanism.  Indeed Arthur views this ‘combinatorial evolution’ as the key difference between biological and technological evolution. This assertion of the importance of components is supported by computer simulation studies as well as historical analysis. However, this is not the only key insight in Arthur’s work.

Arthur emphasises the role of what he calls ‘domains’, in his words a “constellation of technologies” forming a “mutually supporting set” (NoT,p.71).  These are clusters of technologies/ideas/knowledge that share some common principle, such as ‘radio electronics’ or ‘steam power’.  Their importance is such that he asserts that “design in engineering begins by choosing a domain” and that the “domain forms a language” within which a particular design is an ‘utterance’.  However, domains themselves evolve: spawned from existing domains or natural phenomena, maturing, and sometimes dying away (like steam power).

The mutual dependence of technologies within a domain can lead to these domains suddenly developing very rapidly, and this is one of the key mechanisms to which Arthur attributes more revolutionary change in technology.  Positive feedback effects are well studied in cybernetics and are one of the key mechanisms in chaos and catastrophe theory, which became popularised in the late 1970s.  However, Arthur is rare in fully appreciating the potential for these effects to give rise to sudden and apparently random changes.  It is often assumed that evolutionary mechanisms give rise to ‘optimal’ or well-fitted results.  In other areas too, you see what I have called the ‘fallacy of optimality’1; for example, in cognitive psychology it is often assumed that given sufficient practice people will learn to do things ‘optimally’ in terms of mental and physical effort.

human creativity and ingenuity

Arthur’s account is clearly more advanced than the earlier, more gradualist ones, but I feel that in pursuing the evolution of technology based on its own internal dynamics, he underplays the human element of the story.   Arthur even goes so far as to describe technology using Maturana’s term autopoietic (NoT,p.170) — something that is self-(re)producing, self-sustaining … indeed, in some sense with a life of its own.

However, he struggles with the implications of this.  If technology responds to “its own needs” rather than human needs, and “instead of fitting itself to the world, fits the world to itself” (NoT,p.214), does that mean we live with, or even within, a Frankenstein’s monster that cares as little for the individuals of humanity as we do for our individual shedding skin cells?  Because of positive feedback effects, technology is not deterministic; however, it is rudderless, cutting its own wake, not ours.

In fact, Arthur ends his book on a positive note:

“Where technology separates us from these (challenge, meaning, purpose, nature) it brings a type of death. But where it affirms these, it affirms life. It affirms our humanness.” (NoT,p.216)

However, there is nothing in his argument to admit any of this hope; it is more a forlorn hope against hope.

Maybe Arthur should have ended his account at its logical end.  If we should expect nothing from technology, then maybe it is better to know it.  I recall as a ten-year-old child wondering just these same things about the arc of history: do individuals matter?  Would the Third Reich have grown anyway without Hitler, and would Britain have survived without Churchill?  Did I have any place in shaping the world in which I was to live?  Many years later, as I began to read philosophy, I discovered these were questions that had been asked before, with opposing views, but no definitive empirical answer.

In fact, for technological development, just as for political development, things are probably far more mixed, and reconciling Basalla’s and Arthur’s accounts might suggest that there is space both for Arthur’s hope and for human input into technological evolution.

Recall there were two main places where Basalla placed human input (individual and social/cultural): novelty and selection.

The crucial role of selection in Darwinian theory is evident in its eponymous role: “Natural Selection”.    In Darwinian accounts this is driven by the breeding success of individuals in their niche, and certainly the internal dynamics of technology (efficiency, reliability, cost effectiveness, etc.) are one aspect of technological selection.  However, as Basalla describes in greater detail, there are many human aspects to this as well, from the multiple individual consumer choices within a free market to government legislation, for example regulating genome research or establishing emissions limits for cars. This suggests a relationship with technology less like that with an independently evolving wild beast and more like that of the farmer artificially selecting the best specimens.

Returning to the issue of novelty: as I’ve noted, even Basalla seems to underplay human ingenuity in the stories of particular technologies, and Arthur even more so.  Arthur attempts to account for “the appearance of radically novel technologies” (NoT,p.17) through the composition of components.

One example of this is the ‘invention’ of the cyclotron by Ernest Lawrence (NoT,p.114).  Lawrence knew of two pieces of previous work: (i) Rolf Wideröe’s idea to accelerate particles using AC current down a series of (very) long tubes, and (ii) the fact that magnetic fields can make charged particles swing round in circles.  He put the two together and thereby made the cyclotron: AC currents sending particles ever faster round a circular tube.  Lawrence’s first cyclotron was just a few feet across; now, at CERN and elsewhere, they are miles across, but the principle is the same.

Arthur’s take-home message from this is that the cyclotron did not spring ready-formed and whole from Lawrence’s imagination, like Athena from Zeus’ head.  Instead, it was the composition of existing parts.  However, the way in which these individual concepts or components fitted together was far from obvious.  In many of the case studies the component technologies or basic natural phenomena had been around and understood for many years before they were linked together.  In each case study it seems that the vital key in putting together the disparate elements is the human one — heroic inventors after all 🙂

Some aspects of this invention are not specifically linked to composition: experimentation and trial-and-error, which effectively try out things in the lab rather than in the marketplace; the inventor’s imagination of fresh possibilities and their likely success, effectively trial-and-error in the head; and certainly the body of knowledge (the domains, in Arthur’s terms) on which the inventor can draw.

However, the focus on components and composition does offer additional understanding of how these ‘breakthroughs’ take place.  Randomly mixing components is unlikely to yield effective solutions.  Human inventors’ understanding of the existing component technologies allows them to spot potentially viable combinations; and, perhaps even more important, their ability to analyse the problems that arise allows them to ‘fix’ the design.

In my own work in creativity I often talk about crocophants, the fact that arbitrarily putting two things together, even if each is good in its own right, is unlikely to lead to a good combination.  However, by deeply understanding each, and why they fit their respective environments, one is able to intelligently combine things to create novelty.

Darwinism and technology

Both Arthur and Basalla are looking for a modified version of Darwinism to understand technological evolution.  For Arthur it is the way in which technology builds upon components with ‘combinatorial evolution’.  While pointing to examples in biology, he remarks that “the creation of these larger combined structures is rarer in biological evolution — much rarer — than in technological evolution” (NoT,p.188).  Strangely, the power of sexual reproduction over simple mutation is precisely that it allows the ‘construction’ and ‘swopping’ of components; this is why artificial evolutionary algorithms often outperform simple mutation (a form of stochastic hill-climbing, itself usually better than deterministic hill-climbing). However, technological component combination is not the same as biological combination.
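The power of swopping components can be seen in a toy genetic-algorithm sketch.  (This is entirely invented for illustration, not from either book: the two parent ‘genotypes’ are contrived so that each is good in a different half, and the names `one_point_crossover` and `mutate` are mine.)

```python
import random

# Two parent 'genotypes', each strong in a different component
# (1s are good genes, 0s bad) -- a deliberately contrived example.
parent_a = [1] * 10 + [0] * 10
parent_b = [0] * 10 + [1] * 10

def one_point_crossover(a, b, point):
    """Combine the first 'component' of one parent with the second of the other."""
    return a[:point] + b[point:]

def mutate(genes, rate=0.05, rng=random.Random(0)):
    """Flip each gene independently with the given probability."""
    return [1 - g if rng.random() < rate else g for g in genes]

fitness = sum  # fitness = number of good genes

# crossover at the component boundary gets both good halves at once
child = one_point_crossover(parent_a, parent_b, 10)
print(fitness(child))             # 20 -- the optimum in a single step

# mutation alone only stumbles towards it one lucky flip at a time
print(fitness(mutate(parent_a)))  # typically still around 10
```

Crossover reaches in one step what pure mutation would need many lucky generations to find, which is exactly the ‘construction and swopping of components’ point.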

A core ‘problem’ for biological evolution is the complexity of the genotype–phenotype mapping.  Indeed, in “The Selfish Gene” Dawkins attacks Lamarckism precisely on the grounds that the mapping is impossibly complex and hence cannot be inverted2.  In fact, Dawkins’ argument would also ‘disprove’ Darwinian natural selection, as that also depends on the mapping not being too complex.  If the mapping between genotype and phenotype were as complex as Dawkins suggested, then small changes to genotypes as gene patterns would lead to arbitrary phenotypes, and so the fitness of parents would not be a predictor of the fitness of offspring. In fact, while not simple to invert (as is necessary for Lamarckian inheritance), the mapping is simple enough for natural selection to work!

One of the complexities of the genotype–phenotype mapping in biology is that the genotype (our chromosomes) is far simpler (less information) than our phenotype (body shape, abilities, etc.).  Also, the production mechanism (a mother’s womb) is no more complex than the final product (the baby).  In contrast, for technology the genotype (plans, specifications, models, sketches) is of comparable complexity to the final product.  Furthermore, the production means (factory, workshop) is often far more complex than the finished item (but not always: a skilled woodsman can make a huge variety of things using a simple machete, and there is interesting work on self-fabricating machines).

The complexity of the biological mapping is particularly problematic for the kind of combinatorial evolution that Arthur argues is so important for technological development.  In the world of technology, the schematic of a component is a component of the schematic of the whole — hierarchies of organisation are largely preserved between phenotype and genotype.  In contrast, genes that code for finger length are also likely to affect toe length, and maybe other characteristics as well.

As noted, sexual reproduction does help to some extent, as chromosome crossovers mean that some combinations of genes tend to be preserved through breeding, so ‘parts’ of the whole can develop and then be passed together to future generations.  If genes are on different chromosomes this process is a bit hit-and-miss, but there is evidence that genes coding for functionally related things (and therefore good to breed together) end up close together on the same chromosome, and hence are more likely to be passed on as a unit.

In contrast, there is little hit-and-miss about technological ‘breeding’: if you want component A from machine X and component B from machine Y, you just take the relevant parts of the plans and put them together.

Of course, getting component A and component B to work together is another matter; typically some sort of adaptation or interfacing is needed.  In biological evolution this is extremely problematic: as Arthur says, “the structures of genetic evolution” mean that each step “must produce something viable” (NoT,p.188).  In contrast, the ability to ‘fix’ the details of a composition in technology means that combinations that are initially not viable can become so.

However, as noted at the end of the last section, this is due not just to the nature of technology, but also human ingenuity.

The crucial difference between biology and technology is human design.

technological context and infrastructure

A factor that seems to be weak or missing in both Basalla’s and Arthur’s theories is the role of infrastructure and the general technological and environmental context3. This is highlighted by the development of the wheel.

The wheel and fire are often regarded as core human technologies, but whereas fire is near universal (indeed predating modern humans), the wheel was only developed in some cultures.  It has long annoyed me that the fact that South American civilisations did not develop the wheel is seen as some kind of lack or failure of those civilisations.  It has always seemed evident to me that the wheel was not developed everywhere simply because it is not always useful.

It was wonderful therefore to read Basalla’s detailed case study of the wheel (EoT,p.7–11), where he backs up with hard evidence what for me had always been a hunch.  I was aware that the Aztecs had wheeled toys even though they never used wheels for transport; Basalla quite sensibly points out that this is reasonable given the terrain and the lack of suitable draught animals. He also notes that between 300 and 700 AD wheels were abandoned in the Near East and North Africa — wheels are great if you have flat hard natural surfaces, or roads, but not so useful on steep broken hillsides, in thick forest, or on soft sandy deserts.

In some ways these combinations (wheels and roads, trains and rails, electrical goods and electricity generation) can be seen as a form of domain in Arthur’s sense, a “mutually supporting set” of technologies (NoT,p.71); indeed he does talk about the “canal world” (NoT,p.82).  However, he is clearly thinking more about the component technologies that make up a new artefact, and less about the set of technologies that need to surround a new technology to make it viable.

The mutual interdependence of infrastructure and related artefacts forms another positive feedback loop. In fact, in his discussion of ‘lock-in’, Arthur does talk about the importance of “surrounding structures and organisations” as a constraint often blocking novel technology, and about the way some technologies are only possible because of others (e.g. complex financial derivatives, only possible because of computation).  However, the best example is Basalla’s description of the development of the railroad vs. the canal in the American Mid-West (EoT,p.195–197).  This is often seen as simply the result of the superiority of the railway, but in the 1960s the historian Robert Fogel made a detailed economic comparison and found that there was no clear financial advantage; it is just that once one began to become dominant, the positive feedback effects made it the sole winner.
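The rail-vs-canal story can be caricatured in a few lines of simulation.  (A minimal sketch only: the labels, the superlinear weighting and the function `adoption_race` are all my own invention, not Fogel’s model or Arthur’s; each new adopter simply favours whichever technology already has more adopters.)

```python
import random

def adoption_race(steps=10_000, seed=0):
    """Two identical technologies; each new adopter picks one with
    probability weighted (superlinearly) by current adoption, so small
    early accidents snowball into near-monopoly."""
    rng = random.Random(seed)
    counts = {"rail": 1, "canal": 1}
    for _ in range(steps):
        r, c = counts["rail"] ** 2, counts["canal"] ** 2
        winner = "rail" if rng.random() < r / (r + c) else "canal"
        counts[winner] += 1
    return counts

# neither technology is better, yet one always ends up dominant,
# and which one wins depends only on chance early adoptions
for seed in range(3):
    counts = adoption_race(seed=seed)
    share = max(counts.values()) / sum(counts.values())
    print(seed, counts, f"winner share {share:.0%}")
```

The point of the sketch is that lock-in needs no intrinsic superiority at all; the positive feedback alone picks a sole winner.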

Arthur’s compositional approach focuses particularly on hierarchical composition, but these infrastructures often cut across components: the hydraulics in a plane, the electrical system in a car, or Facebook’s ‘Open Graph’. And of course one of the additional complexities of biology is that we have many such infrastructure systems in our own bodies: the blood stream, the nervous system, food and waste management.

It is interesting that the growth of the web was made possible by a technological context of the existing internet and home PC sales (which initially were not about internet use, even though now this is often the major reason for buying computational devices).  However, maybe the key technological context for the modern web is the credit card: it is online payments and shopping, or the potential for them, that have financed the spectacular growth of the area. There would be no web without Berners-Lee, but equally none without Barclaycard.

  1. see my WebSci’11 paper for more on the ‘fallacy of optimality’[back]
  2. Why Dawkins chose to make such an attack on Lamarckism I’ve never understood, as no-one had believed in it as an explanation for nearly 100 years.  Strangely, it was very soon after “The Selfish Gene” was published that examples of Lamarckian evolution were discovered in simple organisms, and recently in higher animals, although in the latter through epigenetic (non-DNA) means.[back]
  3. Basalla does describe the importance of “environmental influences”, but is referring principally to the natural environment.[back]

One week to the next Tech Wave

Just a week to go now before the next Tiree Tech Wave starts, although the first person is coming on Sunday and one person is going to hang on for a while afterwards to get some surfing in.

Still plenty of room for anyone who decides to come at the last minute.

Things have been a little hectic, as I’m having to do more of the local organisation this time, so I’ve been running round the island a bit, but I’m really looking forward to when people get here 🙂  The last two times I’ve felt a bit of tension leading up to the event as I feel responsible.  It is difficult planning an event and not having a schedule, “person A giving talk at 9:30, person B at 10:45”; strangely it is much harder having nothing, simply trusting that good things will happen.  Hopefully this time I have had enough experience to know that if I just hang back and resist the urge to ‘do something’, then people will start to talk together, work together, make together — I just need to have the confidence to do nothing1.

At previous TTWs we have had open evenings when people from the local community have come in to see what is being done.  This time, as well as a general welcome for people to come and see, Jonnet from HighWire at Lancaster is going to run a community workshop on mending based on her personal and PhD work on ‘Futuremenders’. Central to this is Jonnet’s pledge to not acquire any more clothes, ever, but instead to mend and remake. This picks up on textile themes on the island, especially the ‘Rags to Riches Eco-Chic’ fashion award and community tapestry group, but also Tech Wave themes of making, repurposing and generally taking things to pieces.   Jonnet’s work is not techno-fashion (no electroluminescent skirts, or LEDs stitched into your woolly hat), but does use social connections, both physical and through the web, to create mass participation, including mass panda knitting and an attempt on the world mass darning record.

For the past few weeks I have had an unusual (although one I hope will become usual) period of relative stability on the island, after a previous period of 8 months almost constantly on the move.  This has included some data hacking and learning HTML5 for mobile devices (hence some hacker-ish blog posts recently).  I hope to finish off one mini-project during the TTW that will be particularly pertinent on the weekend the clocks ‘go forward’ an hour for British Summer Time.  Will blog if I do.

I hit the road last November almost immediately the Tech Wave finished, so never got time to tidy things up.  So, before this one starts, I really should try to write up a couple of activities from last time, as I’m sure there will be plenty more this time round…

  1. Strangely, I always give people the same advice when they take on management roles: “the brave manager does nothing”.  How rare that is.  In a university, a new Vice Chancellor starts and feels he/she has to change things — new faculty structure, new committees. “In the long run, it will be better”, everyone says, but I’ve always found such re-organisation is itself re-organised before we ever get to “the long run”.[back]

book: Nightingale, Peter Dorward

Peter Dorward’s Nightingale is a truly beautiful tale, both in language and story.  Not beautiful in a pink ribbons sense, but with a harsh, sometimes almost brutal directness.  Dorward is a Scot, so perhaps the image of whisky is pertinent.  Certainly not a liqueur like limoncello, strong beneath but covered over with sweetness, like aspects of the Italy Dorward portrays, but like a South-East Islay malt, a smoke-tar flavour that almost makes you gag and yet all the richer for its lack of compromise.

Nightingale takes us into Italy’s “Years of Lead” (Anni di piombo), the period of political terrorism from left and right that left thousands dead, and in particular the 1980 railway station bombing in Bologna, which killed eighty-five people one hot holiday morning.  This is hardly an easy topic to deal with.  The jacket describes the novel as a ‘literary thriller’, but it is at heart about people: the almost comic, but bloody, naïveté of political extremism, and the tenuous glory of love.

Although the central character in the novel is Scottish, and the protagonists include a German Baader-Meinhof acolyte and an Egyptian bartender, Italians and Italy form not just the backdrop, but permeate the pages of Nightingale. Dorward describes Italy with sensitivity and straightforwardness, and I think loves the country and the people in the same way I have come to; yet he is aware of the dark undercurrents that often underlie the Formica-tabled pizzerias and high-fashion boutiques.

I recall a  few years ago seeing flowers around a plaque on the wall, just opposite the entrance to the University of Rome “La Sapienza” in via Salaria.  I had been visiting occasionally for several years, but not noticed the plaque before.  I was told it was to commemorate a Professor of the University, Massimo D’Antona, who had been assassinated some years earlier (1999) for serving on a government committee looking into the reform of labour law.  In the UK it sometimes seems we have lost our passion, that politics and life end up in a lassitude and compromise, that we need some of the passion of the south.  And yet, this passion comes at a cost.

I came to Nightingale through reading Andrew Greig’s At the Loch of the Green Corrie.  The central part of Greig’s semi-biographical, semi-autobiographical book is his journey to fish at the loch of the title, accompanied by two close friends, brothers, one of whom was Peter. One evening, camping beside another loch, in conversation oiled with whisky drunk from camping mugs, Peter shares his early ideas for a story.  He is a GP in London at the time, and dabbling in writing, but yet to write a full novel.

I was captivated by this real story, of the man and his desires, and instantly reached for the internet to find him.  It was with so much joy that I saw he had written the novel, and was now an award-winning author (and still a doctor, but now in Scotland). Greig’s account had opened up such an intimacy with these brothers, so wonderful to see those nascent ideas, on that midge-plagued, peat-mattressed shoreline, bear fruit.

using the Public Suffix list

On a number of occasions I have wanted to decompose domain names, for example in the URL recogniser in Snip!t.  However, one problem has always been the bit at the end.  It is clear that ‘com’ and ‘ac.uk’ are the principal suffixes of ‘www.alandix.com’ and ‘www.cs.bham.ac.uk’ respectively.  However, while I know that for UK domains it is the last two components that are important (second-level domains), I never knew how to work this out in general for other countries.  Happily, Mozilla and other browser vendors have an initiative called the Public Suffix List, which provides a list of just these critical second-level (and deeper-level) suffixes.

I recently found I needed this again as part of my Talis research.  There is a Ruby library and a Java sourceforge project for reading the Public Suffix list, and an implementation by the DKIM Reputation project, that transforms the list into generated tables for C, PHP and Perl.  However, there was nothing for easily and automatically maintaining access to the list.  So I have written a small PHP class to parse, store and access the Public Suffix list. There is an example in the public suffix section of the ‘code’ pages in this blog, and it also has its own microsite including more examples, documentation and a live demo to try.
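
The core lookup the list enables — finding the ‘registrable’ domain by matching the longest listed suffix and keeping one extra label — can be sketched in a few lines. This is a toy illustration with a hand-picked handful of rules, not my PHP class: the real list has thousands of entries, plus wildcard (‘*’) and exception (‘!’) rules that a full implementation must handle.

```javascript
// Toy subset of the Public Suffix List -- the real list is far larger
// and includes wildcard ("*") and exception ("!") rules, omitted here.
const suffixes = new Set(["com", "org", "uk", "co.uk", "ac.uk"]);

// Return the registrable domain: the longest listed suffix
// plus the one label immediately before it.
function registrableDomain(host) {
  const labels = host.toLowerCase().split(".");
  // Candidates shrink label by label, so the longest suffix wins first.
  for (let i = 0; i < labels.length - 1; i++) {
    const candidate = labels.slice(i + 1).join(".");
    if (suffixes.has(candidate)) {
      return labels.slice(i).join(".");
    }
  }
  return null; // no rule matched
}

// e.g. registrableDomain("www.alandix.com")   -> "alandix.com"
//      registrableDomain("www.cs.bham.ac.uk") -> "bham.ac.uk"
```

Note that for ‘www.cs.bham.ac.uk’ the longer rule ‘ac.uk’ correctly takes precedence over plain ‘uk’, which is exactly why the list is needed.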

spice up boring lists of web links – add favicons using jQuery

Earlier today I was laying out lists of links to web resources, initially as simple links:

However, this looked a little boring and so I thought it would be good to add each site’s favicon (the little icon the browser shows for the site), and have a list like this:

  jQuery home page

  Wikipedia page on favicons

  my academic home page

The pages with the lists were being generated, and the icons could have been inserted using a server-side script, but to simplify the server-side code (for speed and maintainability) I put the fetching of favicons into a small JavaScript function using jQuery.  The page is initially written (or generated) with default images, and the script simply fills in the favicons when the page is loaded.
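
The idea can be sketched as follows. This is a hypothetical fragment, not the actual script (see the code page linked below for that): it assumes each site serves its icon from the conventional /favicon.ico location, whereas a fuller version would also honour a page’s <link rel="icon"> declaration, and the ‘withicon’ class name is just for illustration.

```javascript
// Derive a site's default favicon URL from any page URL on that site.
// Assumes the conventional /favicon.ico location at the site root.
function faviconUrl(pageUrl) {
  const u = new URL(pageUrl);
  return u.origin + "/favicon.ico";
}

// With jQuery loaded, the placeholder image before each link could then
// be filled in once the page has loaded, e.g. (class name hypothetical):
//   $(function () {
//     $("a.withicon").each(function () {
//       $(this).prev("img").attr("src", faviconUrl(this.href));
//     });
//   });

// e.g. faviconUrl("https://en.wikipedia.org/wiki/Favicon")
//        -> "https://en.wikipedia.org/favicon.ico"
```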

The list above is made by hand, but look at this example page to see the script in action.

You can use this in your own web pages and applications by simply including a few JavaScript files and adding classes to certain HTML elements.

See the favicon code page for a more detailed explanation of how it works and how to use it in your own pages.

Gordon’s example to us all

Last night I read a BBC article on Gordon Brown’s earnings since he stopped being Prime Minister a few years ago.  I felt a lump coming to my throat as I read the story.  Ex-PMs typically have lucrative post-government careers with lecture tours and the like.  Gordon Brown has similarly earned 1.4 million pounds in lecture fees and book royalties, but then given it all away.

In the run up to the General Election in 2010 I wrote how I gradually warmed to Gordon Brown during the campaign as it became increasingly clear that he was a man of true integrity.  This is another indication of that integrity, and utterly amazing to see in the modern world.

Of course he was not pretty like David Cameron or Nick Clegg, nor could he control his irritation when faced with objectionable, if popular, views.  In short, not a showman, nor a celebrity, not slick, not ‘political’ – just a genuinely good man.

It is sad that that is not sufficient to impress the 21st-century voter.

Dinner or tea, lunch or dinner – signs of class or the times

I was pondering the words of the old advertising jingle1:

I like a nice cup of tea in the morning,
Just to start the day you see;
And at half past eleven,
Well my idea of heaven,
Is a nice cup of tea.

I like a nice cup of tea with my dinner,
And a nice cup of tea with my tea,
And about this time of night,
What goes down a treat, you’re right,
It’s a nice cup of tea.

As well as the deep truth underlying the words, I suddenly became aware of the beginning of the second stanza: “a nice cup of tea with my dinner, and a nice cup of tea with my tea”.

I’d guess the last part of this may be confusing to a non-UK audience, or it may conjure up images of period-drama afternoon tea with cucumber sandwiches and parasols over a game of croquet.

Now the meaning of ‘dinner’ has been a matter of discussion in my household for years.

When I was a child ‘dinner’ was the light meal in the middle of the day, whereas ‘tea’ was the main meal at around 6 o’clock.

In contrast, Fiona takes a more pragmatic approach: ‘dinner’ is the main meal whether taken midday or in the evening.

My impression is that, when I was a child, this was part of a general class distinction. Posh (middle class) people ate lunch at midday, dinner in the evening, watched BBC and drank coffee. The working class ate dinner at midday, ate tea in the evening, watched ITV (the channel with adverts), and drank tea.

Weirdly, in school one had ‘school dinners’ (or ‘free dinners’ if on benefits), but brought ‘packed lunches’.

We have sometimes discussed whether the tea/dinner distinction was more a Welsh-ism. But the advertising jingle clearly shows it was widespread2.

Now-a-days I tend to use the words rather interchangeably, and am certainly happy to use ‘lunch’. Is this because I have become part of the professional classes, or a general shift of language?

What do you call meals? Is it the same as when you were little? Is it still a class distinction?

  1. According to responses in AnswerBank, this was from an original 1937 song for Brooke Bond ‘D’ brand … and in fact the word ‘tea’ was replaced by ‘D’ … but I obviously missed this and remember it as ‘tea’!  The original lyrics have slightly different final lines, “And when it’s time for bed, There’s a lot to be said, For a nice cup of tea”, or maybe I simply misremembered the advert.[back]
  2. even in 1937[back]

Lies vs. facts: the 26k benefits ceiling

In the UK the government is proposing a ceiling on benefits of £26,000. This sounds a large figure, indeed it is the median income, so seems reasonable that someone out of work should not receive more than the average working person. The press is, of course, polarised on the issue, as is the Church of England.

I was particularly interested in the coverage in last Wednesday’s Daily Mail, partly as this was where the former Archbishop of Canterbury chose to issue a statement about the issue, and partly because I was on a BA flight and it is one of the free newspapers! This issue of the Mail contained an article, “The hard workers who are proud not to claim”1, detailing the circumstances of three different working and tax-paying households living below or close to the proposed £26,000 limit, who can’t understand why they are working and paying taxes to support others to live on more than them.

I wondered about the truth behind these stories.  As you might imagine, the Mail’s stories were, to be generous, disingenuous, and most probably misleading, both to their readers and those they interviewed. When you work out the actual figures and facts behind the stories, things turn out rather differently than they were projected.

The issue of the proposed £26,000 benefits ceiling was particularly hot in the news after the House of Lords made radical amendments to the bill. The opposition in the Lords to proposed benefits reforms comes not just from the Labour benches, but includes some LibDems and Conservatives, and, vocally, several Church of England bishops2.

Lord Carey, the former Archbishop of Canterbury, weighed into this debate chastising his fellow bishops in the Lords, on the grounds that the weight that the national debt lays on our children is a major moral issue and the runaway benefits bill is a crucial part of controlling this.

There are of course differing views on how fast and how radically we should be attempting to cut the national debt and how this should be accomplished. What is notable is that Carey chose to make this statement in the Daily Mail. My guess is that he chose the Mail, rather than, say, the Times or the Telegraph (let alone the Independent or Guardian, who might have published it alongside contrary views), because the Mail is much more a paper for ordinary Middle England folk, the ‘squeezed middle’, who feel they are paying the bulk of the taxes that fund the burgeoning benefits budget.

Whilst the ‘quality’ newspapers push their own particular viewpoint, they do follow a certain journalistic ethic, and normally within their articles you find the full facts, as they know them. Now, this is sometimes very deeply buried, to the point of disinformation, but is at least present; the careful reader can see the counter arguments through the opinion.

The Mail has no such scruples; it is unashamedly a newspaper of persuasion not information.

Given this, however much the Mail is targeting a particular demographic, Carey’s choice seemed misguided or naive.

In particular, in the same copy as Carey’s statement, there was the article describing the three households, all in tight economic circumstances, but who are working, paying tax to fund benefits, but not on benefits themselves. This is, in fact, excellent journalism: cold figures are hard to comprehend; real examples can convey the truth better than abstractions.

One household was a single woman, Rachel, living on her own; the second, Lauren and David, an engaged couple with a baby living with one of their parents; and the third, Emma and Darren, a married couple with young twins, living in a rented house. They all had net incomes below or close to the proposed £26,000 benefits cap, and in each case the description ends with a personal statement, which expresses their frustration: if they manage to cope on their income, why should people who are not in work need £26,000?

“I don’t understand why people would need to claim more than £26,000 in benefits if I can live comfortably on this”, Rachel

“It’s crazy that people say they can’t live off £26,000. People need to make sacrifices like the rest of us have.”, Lauren

“It makes us very angry that my husband works so hard and pays tax on his income, which goes to pay the benefits bills of all those people who don’t work and who receive more money than us.”, Emma

What the Mail reporters clearly failed to tell any of these families is what they would be receiving on benefits if they were suddenly made redundant and out of work.

Just to see, I put each of these people’s circumstances into the government benefits calculator and a housing benefit calculator3.

Rachel lives alone with £16,000 gross income and £13,000 net income. She describes rent (£500) and bills taking up most of her income, but leaving her with £250 a month for “recreational and leisure activities”, allowing her to “live comfortably”. If she lost her job her benefits, including housing benefit to contribute to rent, would total £9,774 per annum (per week: £53.45 jobseeker’s allowance, £19.38 council tax rebate4, £115.30 housing benefit). That would cover just what she describes as her basic bills, with none of her recreation or leisure. I’m sure if asked whether she would be happy to live on this, her answer would be different.

Lauren and David fare worst; they have a gross salary of £33,000, with a net income of £27,560 (including child benefit and child tax credits). If they were both to lose their jobs, they would take home a total of £200.61 a week, around £10,500 per annum5. It was Lauren who said, “People need to make sacrifices like the rest of us have”. If the Mail reporter had explained to her that she would have to cope on two fifths of their current take-home money, would she feel the same?

It is the last family, however, that does appear to highlight anomalies in the benefits system. Darren works in public transport and has a gross pay of precisely £26,000, leaving Emma and Darren with a take-home pay of £21,608 (including child benefit). If Darren lost his job (or found himself unable to work, as he has a medical condition) and both of them registered as job seekers (although Emma currently looks after the children at home), then they would receive a total of £24,295 a year (just over £15,000 of this is basic benefit, the rest council tax relief6 and housing benefit), more than their current take-home pay.

The reason for this disparity is that Emma and Darren do not attempt to claim benefits: “We are proud that we’re not on benefits, although sometimes it can be really hard”. In fact they would be eligible for substantial housing benefit7, which would presumably make all the difference for them and their children.

The shame of being on welfare runs deep, and, assuming Emma and Darren are Mail readers, no doubt fanned by the constant stories of welfare scroungers and the ‘feckless’. They quite rightly want to instil an ethic of hard work into their children, but do not feel able to claim benefits, which they will have contributed to through tax and national insurance throughout their previous working lives, in order to help as they bring up those same children now.

Interestingly, they are happy to accept child benefit (and I assume child tax credit, although this is not explicitly mentioned), and when the children are of school age they will not send them to a fee-paying school, but will be happy to send them to a state school, effectively an educational ‘benefit’ of around £16,000 a year; nor did they insist on paying the hospital and doctors’ fees for the delivery of the twins and their subsequent medical care.

The difference is that these benefits, allowances, and services are universal, and so seen as ‘rights’ as a taxpayer, even if, as in the case of this family, you are a net beneficiary.

This very much strengthens the case for maintaining child benefit as a non-means tested benefit. In general, many benefits are not claimed, whether through pride, principles or ignorance. The one exception is child benefit, which is both universally accepted and well targeted8.

Maybe if apprised of the full facts each of the people interviewed by the Mail might still feel the same, particularly Emma and Darren. Maybe too Mail readers would feel the same if presented with the truth. But clearly the Mail does not trust its readers to make up their own minds if given the full facts, and sadly Lord Carey has lent his weight to this deliberate disinformation: unintentionally, but very persuasively, helping to mislead the public.

  1. “The hard workers who are proud not to claim”, Daily Mail, Wednesday, January 25, 2012, p. 7.[back]
  2. Whether they should be in the second house in the first place is another issue![back]
  3. I used the Tonbridge & Malling Borough Council’s web site as this has an online housing benefit calculator.  While currently housing benefit is similar across the country, this may change in the future with government plans for ‘localising support‘, the potential impact of which has been under-reported.[back]
  4. For Rachel on a one bedroom flat I estimated a council tax bill of £1000.[back]
  5. This figure is particularly low as they live with parents.  While the government makes strong statements about family values, there are equally strong disincentives to supporting close family.  If Lauren and David were out of work, but living with friends rather than parents, they would be able to pay rent to contribute to household costs, which they could then claim against housing benefit.  Furthermore, if a grown-up child receives cash support from parents, it is regarded as income for the calculation of benefits.[back]
  6. For Emma and Darren I estimated an annual council tax bill of £1500.[back]
  7. Housing benefit is perhaps the greatest cause of anomalies in the system. Even Boris Johnson was against a cap on housing benefit, as the proposed, albeit apparently high, limit would still make large areas of London (not just the fancy bits!) no-go areas for anyone on an average wage, including nurses, transport workers, etc.  The situation gets even more complicated for those with a mortgage, as mortgage interest is deemed a cost in the benefits calculation when you are out of work, but not when you have a job.[back]
  8. More broadly there is a minority suggestion (I believe only the Green Party in the UK support this) to replace all tax allowances and basic benefits with a universal wage or ‘basic income‘, effectively an amount for every adult and child deemed high enough for basic survival (probably close to current basic benefit levels). Indeed the amount you gain through the personal tax allowance, the amount you can earn without paying tax, is very close to a single person’s jobseeker’s allowance, so this is very nearly a ‘zero sum’ for taxpayers without children.[back]