More or Less: will 50,000 people really die if the universities reopen?

Last Wednesday morning I had mail from a colleague to say that my paper on student bubble modelling had just been mentioned on Radio 4’s ‘More or Less’ [BBC1]. This was because UCU (the University and College Union) had tweeted the headline figure of 50,000 deaths from my paper “Impact of a small number of large bubbles on Covid-19 transmission within universities” [Dx1] after it had been reviewed by Jim Dickinson on Wonkhe [DW]. The issue is continuing to run: on Friday a SAGE report [SAGE] was published, also highlighting the need for vigilance around university reopening, and this morning Dame Anne Johnson was interviewed on Today [BBC2], warning of “a ‘critical moment’ in the coronavirus pandemic, as students prepare to return to universities”.

I’m very happy that these issues are being discussed widely; that is the most important thing.   Unfortunately I was never contacted by the programme before transmission, so I am writing this to fill in details and correct misunderstandings.

I should first note that the 50,000 figure was a conditional one:

without strong controls, the return to universities would cause a minimum of 50,000 deaths

The SAGE report [SAGE] avoids putting any sort of estimate on the impact.  I can understand why! As with climate change, one of the clear lessons of the Covid crisis is how difficult it is to frame arguments involving uncertainty and ranges of outcomes in ways that allow meaningful discussion but also avoid ‘Swiss cheese’ counter-arguments that seek out the one combination of options that, taken together, might give rise to a wildly unlikely outcome.  Elsewhere I’ve written about some of the psychological reasons and human biases that make it hard to think clearly about such issues [Dx2].

The figure of 50,000 deaths at first appears sensationalist, but in fact the reason I used this as a headline figure was precisely because it was on the lower end of many scenarios where attempts to control spread between students fail.  This was explicitly a ‘best case worst case’ estimate: that is worst case for containment within campus and best case for everything else – emphasising the need for action to ensure that the former does not happen.

Do I really believe this figure?  Well, in reality, of course, if there were major campus outbreaks, local lockdowns or campus quarantine would come into force before the full level of community infection took hold.  If this reaction were fast enough it would limit the wider community impact, although we would never know by how much, as many of the knock-on infections would be untraceable to the original cause. The figure is conditional – we can do things ahead of time to prevent it, or later to ameliorate the worst impacts.

However, it is a robust figure in terms of order of magnitude.  In a different blog post I used minimal figures for small university outbreaks (5% of students) combined with a lower end winter population R, and this still gives rise to tens of thousands of knock-on community infections for every university [Dx3].

More or less?

Returning to “More or Less”, Dr Kit Yates, who was interviewed for the programme, quite rightly examined the assumptions behind the figure, exactly what I would do myself.  However, I imagine he had to do so quite quickly, and so in the interview there was confusion between (i) the particular scenario that gives rise to the 50,000 figure and the general assumptions of the paper as a whole, and (ii) the sensitivity of the figure to the particular values of various parameters in the scenario.

The last of these, the sensitivity, is most critical: some parameters make little difference to the eventual result and others make a huge difference.  Dr Yates suggested that some of the values (each of which has low sensitivity) could be on the high side, but also that one (the most sensitive) was too low.   If you adjust for all of these factors the community deaths figure ends up near 100,000 (see below).  As I noted, the 50,000 figure was towards the lower end of potential scenarios.

The modelling in my paper deliberately uses a wide range of values for various parameters reflecting uncertainty and the need to avoid reliance on particular assumptions about these.  It also uses three different modelling approaches, one mathematical and two computational in order to increase reliability.  That is, the aim is to minimise the sensitivity to particular assumptions by basing results on overall patterns in a variety of potential scenarios and modelling techniques.

The detailed models need some mathematical knowledge, but the calculations behind the 50,000 figure are straightforward:

Total mortality = number of students infected
                  x  knock-on growth factor due to general population R
                  x  general population mortality

So if you wish it is easy to plug in different estimates for each of these values and see for yourself how this impacts the final figure.  To calculate the ‘knock-on growth factor due to general population R’, see “More than R – how we underestimate the impact of Covid-19 infection” [Dx4], which explains the formula (R/(1-R)) and how it comes about.
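
For anyone who wants to experiment, here is a minimal Python sketch of that calculation. The numbers below are purely illustrative plug-in values, not figures lifted directly from the paper: roughly two million students eventually infected, the R=0.7 ‘best case’ general population figure, and 1% mortality.

    def knock_on_factor(r):
        """Knock-on community infections per seed case for R < 1,
        i.e. R + R^2 + R^3 + ... = R / (1 - R)  (see [Dx4])."""
        assert r < 1
        return r / (1 - r)

    def community_deaths(students_infected, population_r, mortality_rate):
        """Student cases x knock-on growth factor x general population mortality."""
        return students_infected * knock_on_factor(population_r) * mortality_rate

    # Illustrative values only -- plug in your own estimates.
    print(community_deaths(students_infected=2_100_000,
                           population_r=0.7,
                           mortality_rate=0.01))   # ~49,000

With these values the result lands in the region of the 50,000 headline figure; changing any of the three inputs scales the answer accordingly.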

The programme discussed several assumptions in the above calculation:

  1. Rate of growth within campus: R=3 and 3.5 days inter-infection period. –  These are not assumptions of the modelling paper as a whole, which only assumes rapid spread within student bubbles and no direct spread between bubbles.  However, these are the values used in the scenario that gives rise to the 50,000 figure, because they seemed the best accepted estimate at the time.  Crucially, the calculations only depend on these being high enough to cause a widespread outbreak across the student population.  Using more conservative figures of (student) R=2 and a 5-6 day inter-infection period, which I believe Dr Yates would be happy with, still means all susceptible students get infected before the end of a term.  The recent SAGE report [SAGE] describes models that have peak infection in November, consonant with these values. (see also addendum 2)
  2. Proportion of students infected. –  Again this is not an assumption but instead a consequence of the overall modelling in the paper.  My own initial expectation was that student outbreaks would limit at 60-70%, the herd immunity level, but it was only as the models ran that it became apparent that cross infections out to the wider population and then back ‘reseeded’ student growth because of clumpy social relationships.  However, this is only apparent at a more detailed reading, so it was not unreasonable for More or Less to think that this figure should be smaller.  Indeed in the later blog about the issue [Dx3] I use a very conservative 5% figure for student infections, but with a realistic winter population R and get a similar overall total.
  3. General population mortality rate of 1%. – In the early days, data for this ranged between 1% and 5% in different countries depending, it was believed, on the resilience of their health service and other factors. I chose the lowest figure.  However, recently there has been some discussion about whether the mortality figure is falling [MOH,LP,BPG].  Explanations include temporary effects (younger demographics of infections, summer conditions) and some that could be long term (better treatment, better testing, viral mutation).  This is still very speculative, with suggestions that the figure could now be closer to 0.7% or (very, very speculatively) even around 0.5%.  Note too that in my calculations this is about the general population, not the student body itself, where mortality is assumed to be negligible.
  4. General population R=0.7. – This is a very low figure, as if the rest of society were in full lockdown and only the universities open. It is the ‘best case’ part of the ‘best case worst case’ scenario. The Academy of Medical Sciences report “Coronavirus: preparing for challenges this winter”, published in July [AMS], suggests winter figures of R=1.2 (low), 1.5 (mid) and 1.8 (high). In the modelling, which was done before this report, I used a range of R values between 0.7 and 3; that is, including the current best estimates.  The modelling suggested that the worst effects in terms of excess deaths due to universities occurred for R in the low ‘ones’, that is, precisely the expected winter figures.

In summary, let’s look at how the above affects the 50,000 figure:

  • 1.  Rate of growth within campus – The calculation is not sensitive to this and hence not affected at all.
  • 2 and 3.  Proportion of students infected and general population mortality rate – These have a linear effect on the final calculation (some sensitivity).  If we take a factor of 0.7 for each (using the very speculative rather than the very, very speculative figure for reduced mortality), this roughly halves the estimated impact.
  • 4. General population R. – This is an exponential factor and hence the final result is very sensitive to it. The original 0.7 was unreasonably low, but reasonable figures tend to lead to frighteningly high impacts.  So let’s still use a very conservative figure of 0.9 (light lockdown), which multiplies the total by just under 4 (9/2.3).

The overall result of this is 100,000 rather than 50,000 deaths.
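
As a quick sanity check of that arithmetic, here is the same adjustment in a few lines of Python (a sketch using the factors described above, not figures taken from the paper):

    base = 50_000
    student_proportion_factor = 0.7                      # item 2: fewer students infected
    mortality_factor = 0.7                               # item 3: lower general mortality
    r_factor = (0.9 / (1 - 0.9)) / (0.7 / (1 - 0.7))     # item 4: R 0.7 -> 0.9, roughly 3.9x

    print(base * student_proportion_factor * mortality_factor * r_factor)   # ~95,000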

In the end you can play with the figures, and, unless you pull all of the estimates to their lowest credible figure, you will get results that are in the same range or a lot higher.

If you are the sort of person who bets on an accumulator at the Grand National, then maybe you are happy to assume everything will be the best possible outcome.

Personally, I am not a betting man.

 

Addendum 1: Key factors in assessing modelling assumptions and sensitivity

More or Less was absolutely right to question assumptions, but this is just one of a number of issues that are all critical to consider when assessing mathematical or computational modelling:

  • assumptions – values, processes, etc, implicitly or explicitly taken as given
  • sensitivity – how reliant a particular result is on the values used to create it
  • scenarios – particular sets of values that give rise to a result
  • purpose – what you are trying to achieve

I’ve mentioned the first three of these in the discussion above. However, understanding the purpose of a model is also critical particularly when so many factors are uncertain.  Sometimes a prediction has to be very accurate, for example the time when a Mars exploration rocket ‘missed’ because of a very small error in calculations.

For the work described here my own purpose was: (i) to assess how effective student bubbles need to be, a comparative judgement, and (ii) to assess whether it matters or not, that is, an order of magnitude judgement.    The 50K figure was part of (ii).  If this figure had been in the tens or even hundreds it could be seen as fairly minor compared with the overall Covid picture, but 10,000, 50,000 or 100,000 are all bad enough to be worth worrying about.  For this purpose fine details are not important, but being broadly robust is.

 

Addendum 2:  Early Covid growth in the UK

The scenario used to calculate the 50K figure used the precise values of R=3 and a 3.5 day inter-infection period, which means that cases can increase by roughly 10 times each week.  As noted, the results are not sensitive to these figures and much smaller values still lead to the same overall answer.
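
For the record, the weekly factor follows directly from those two values; with a 3.5 day cycle there are two infection cycles per week, so

$$ \text{weekly growth} \;=\; R^{\,7/3.5} \;=\; 3^{2} \;=\; 9 \;\approx\; 10\times . $$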

The main reason for using this scenario is that it felt relatively conservative to assume that students post lockdown might have rates similar to overall population before awareness of Covid precautions – they would be more careful in terms of their overall hygiene, but would also have the higher risk social situations associated with being a student.

I was a little surprised therefore that, on ‘More or Less’, Kit Yates suggested that this was an unreasonably high figure because the week-on-week growth had never been more than 5 times.  I did wonder whether I had misremembered the 10x figure, from the early days of the crisis unfolding in February and March.

In fact, having rechecked the figures, they are as I remember.  I’ll refer to the data and graphs on the Wikipedia page for UK Covid data.  These use the official UK government data, but are visualised better than on Gov.UK.

UK Cases:  https://en.wikipedia.org/wiki/COVID-19_pandemic_in_the_United_Kingdom#New_cases_by_week_reported

I’m focusing on the early days of both sets of data.  Note that both new cases and deaths ‘lag’ behind actual infections, hence the peaks after lockdown had been imposed. New cases at that point typically meant people showing symptoms serious enough to be admitted to hospital, so they lag infection by, say, a week or more. Deaths lag by around 2-3 weeks (and indeed deaths more than 28 days after a positive test are not counted, to avoid over-counting).

The two data sets are quite similar during the first month or so of the crisis, as at that point testing was only being done for very severe cases that were being identified as potential Covid. So, let’s just look at the death figures (the most reliable) in detail for the first few weeks, until the lockdown kicks in and the numbers peak.

week                 deaths   growth (rounded)
29 Feb — 6 March          1
7–13 March                8    x8
14–20 March             181    x22
21–27 March             978    x5
28 March — 3 April     3346    x3.5
4–10 April             6295    x2
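
The growth column is just the ratio of successive weekly totals; a few lines of Python using the death counts above reproduce it:

    # Weekly UK Covid-19 deaths, 29 Feb - 10 April 2020 (from the table above).
    weekly_deaths = [1, 8, 181, 978, 3346, 6295]

    for previous, current in zip(weekly_deaths, weekly_deaths[1:]):
        print(f"x{current / previous:.1f}")   # x8.0, x22.6, x5.4, x3.4, x1.9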

Note how there is an initial phase of very fast growth, followed by pre-lockdown slowing as people became aware of the virus and started to take additional voluntary precautions, and then peaking due to lockdown.  The numbers for the initial fast phase are small, but this pattern reflects the early stages in Wuhan, with initial doubling approximately every two days before the public became aware of the virus, followed by a slowdown to around 3 day doubling, followed by lockdown.

Indeed, in the early stages of the pandemic it was common to see country-vs-country graphs of early growth with straight lines for 2 and 3 day doubling drawn on log-scale axes. Countries varied on where they started on this graph, but typically lay between the two lines.  The UK effectively started at the higher end and rapidly dropped to the lower one, before a more dramatic reduction post-lockdown.

It may be that Kit recalled the x5 figure (3 day doubling) as it was the figure once the case numbers became larger and hence more reliable.  However, there is also an additional reason, which I think might be why early growth was often underestimated.  In some of the first countries infected outside China the initial growth rate was closer to the 3 day doubling line. However, this was before community infection, when cases were driven by international travellers from China.  These early international growth rates reflected post-public-precautions, but pre-lockdown, growth rates in China, not community transmission within the relevant countries.

This last point is educated guesswork, and the only reason I am aware of it is because early on a colleague asked me to look at data as he thought China might be underreporting cases due to the drop in growth rate there.  The international figures were the way it was possible to confirm the overall growth figures in China were reasonably accurate.

References

[AMS] Preparing for a challenging winter 2020-21. The Academy of Medical Sciences. 14th July 2020. https://acmedsci.ac.uk/policy/policy-projects/coronavirus-preparing-for-challenges-this-winter

[BBC1] Schools and coronavirus, test and trace, maths and reality. More or Less, BBC Radio 4. 2nd September 2020.  https://www.bbc.co.uk/programmes/m000m5j9

[BBC2] Coronavirus: ‘Critical moment’ as students return to university.  BBC News.  5 September 2020.  https://www.bbc.co.uk/news/uk-54040421

[BPG] Are we underestimating seroprevalence of SARS-CoV-2? Burgess Stephen, Ponsford Mark J, Gill Dipender. BMJ 2020; 370 :m3364  https://www.bmj.com/content/370/bmj.m3364

[DW] Would student social bubbles cut deaths from Covid-19?  Jim Dickinson on Wonkhe.  28 July 2020.  https://wonkhe.com/wonk-corner/would-student-social-bubbles-cut-deaths-from-covid-19/

[DW1] Could higher education ruin the UK’s Christmas?  Jim Dickinson on Wonkhe.  4 Sept 2020.  https://wonkhe.com/blogs/could-higher-education-ruin-the-uks-christmas/

[Dx1] Covid-19 – Impact of a small number of large bubbles on University return. Working Paper, Alan Dix. Created 10 July 2020. arXiv:2008.08147

[Dx2] Why pandemics and climate change are hard to understand, and can we help?  Alan Dix. North Lab Talks, 22nd April 2020 and Why It Matters, 30 April 2020.  http://alandix.com/academic/talks/Covid-April-2020/

[Dx3] Covid-19, the impact of university return.  Alan Dix. 9th August 2020. https://alandix.com/blog/2020/08/09/covid-19-the-impact-of-university-return/

[Dx4] More than R – how we underestimate the impact of Covid-19 infection. Alan Dix.  2nd August 2020. https://alandix.com/blog/2020/08/02/more-than-r-how-we-underestimate-the-impact-of-covid-19-infection/

[LP] Why are US coronavirus deaths going down as covid-19 cases soar? Michael Le Page. New Scientist.  14 July 2020. https://www.newscientist.com/article/2248813-why-are-us-coronavirus-deaths-going-down-as-covid-19-cases-soar/

[MOH] Declining death rate from COVID-19 in hospitals in England. Mahon J, Oke J, Heneghan C. The Centre for Evidence-Based Medicine. June 24, 2020. https://www.cebm.net/covid-19/declining-death-rate-from-covid-19-in-hospitals-in-england/

[SAGE] Principles for managing SARS-CoV-2 transmission associated with higher education, 3 September 2020.  Task and Finish Group on Higher Education/Further Education. Scientific Advisory Group for Emergencies. 4 September 2020. https://www.gov.uk/government/publications/principles-for-managing-sars-cov-2-transmission-associated-with-higher-education-3-september-2020

 

How much does herd immunity help?

I was asked in a recent email about the potential contribution of (partial) herd immunity to controlling Covid-19.  This seemed a question that many may be asking, so here is the original question and my reply (expanded slightly).

We know that the virus burns itself out if R remains < 1.

There are 2 processes that reduce R, both operating simultaneously:

1) Containment which limits the spread of the virus.

2) Inoculation due to infection which builds herd immunity.

Why do we never hear of the second process, even though we know that both processes act together? What would your estimate be of the relative contribution of each process to reduction of R at the current state of the pandemic in Wales?

One of the UK government’s early options was (2) developing herd immunity¹.  That is, you let the disease play out until enough people have had it.
For Covid the natural (raw) R number is about 3 without additional voluntary or mandated measures (depends on lots of factors).   However, over time as people build immunity, some of those 3 people who would have been infected already have been.  Once about 2/3 of the community are immune the effective R number drops below 1.  That corresponds to a herd immunity level (in the UK) of about 60-70% of the population having been infected.  Of course, we do not yet know how long this immunity will last, but let’s be optimistic and assume it does.
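The 2/3 figure is the standard herd-immunity threshold calculation: with basic reproduction number R_0 and a fraction s of the population still susceptible, the effective R is R_0 × s, so
$$ R_0\, s \;<\; 1 \quad\Longrightarrow\quad \text{immune fraction} \;>\; 1 - \frac{1}{R_0} \;=\; 1 - \frac{1}{3} \;\approx\; 67\% . $$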
The reason this policy was (happily) dropped in the UK was the realisation that this would need about 40 million people to catch the virus, with about 4% of these needing intensive care.  That is many, many times the normal ICU capacity, leading to (on the optimistic side) around half a million deaths, but if the health service broke under the strain many times that number!
In Spain (with one of the larger per capita outbreaks) they ran an extensive antibody testing study (that is randomly testing a large number of people whether or not they had had any clear symptoms), and found only about 5% of people showed signs of having had the virus overall, with Madrid closer to 10%.  In the UK estimates are of a similar average level (but without as good data), rising to maybe as high as 17% in London.
Nationally these figures (~5%) do make it slightly easier to control, but this is far below the reduction needed for relatively unrestricted living (as is possible in New Zealand, which chose a near-eradication strategy).  In London the higher level may help a little more (if it proves to offer long-term protection).  However, it is still well away from the levels needed for normal day-to-day life without still being very careful (masks, social distancing, limited social gatherings), though it does offer just a little ‘headroom’ for flexibility.  In Wales the average level is not far from the UK average, albeit higher in the hardest hit areas, so again well away from anything that would make a substantial difference.
So, as you see it is not that (2) is ignored, but, until we have an artificial vaccine to boost immunity levels, relying on herd immunity is a very high risk or high cost strategy.  Even as part of a mixed strategy, it is a fairly small effect as yet.
In the UK and Wales, to obtain even partial herd immunity we would need an outbreak ten times as large as we saw in the Spring, not a scenario I would like to contemplate 🙁
This said there are two caveats that could make things (a little) easier going forward:
1)  The figures above are largely averages, so there could be sub-communities that do get to a higher level.  By definition, the communities that have been hardest hit are those with factors (crowded accommodation, high-risk jobs, etc.) that amplify spread, so it could be that these sub-groups, whilst not getting to full herd-immunity levels, do see closer to population spread rates in future hence contributing to a lower average spread rate across society as a whole.  We would still be a long way from herd immunity, but slower spread makes test, track and trace easier, reduces local demand on health service, etc.
2)  The (relatively) low rates of spread in Africa have led to speculation (still very tentative) that there may be some levels of natural immunity from those exposed to high levels of similar viruses in the past.  However, this is still very speculative and does not seem to accord with experience from other areas of the world (e.g. Brazilian favelas), so it looks as though this is at most part of a more complex picture.
I wouldn’t hold my breath for (1) or (2), but it may be that as things develop we do see different strategies in different parts of the world depending on local conditions of housing, climate, social relationships, etc.

Update

Having written the above, I’ve just heard about the following, which came out at the end of last week in the BMJ and suggests that there could be a significant number of mild cases that are not detected on standard blood tests as having been infected.
Burgess Stephen, Ponsford Mark J, Gill Dipender. Are we underestimating seroprevalence of SARS-CoV-2? https://www.bmj.com/content/370/bmj.m3364
  1. I should say the UK government now say that herd immunity was never part of their planning, but for a while they kept using the term! Here’s a BBC article about the way herd immunity influenced early UK decisions, a Guardian report that summarises some of the government documents that reveal this strategy, and a Politico article that reports on the Chief Scientific Adviser Patrick Vallance’s statement that he never really meant this was part of government planning.  His actual words on 12th March were “Our aim is not to stop everyone getting it, you can’t do that. And it’s not desirable, because you want to get some immunity in the population. We need to have immunity to protect ourselves from this in the future.”  Feel free to decide for yourself what ‘desirable‘ might have meant.

Covid-19, the impact of university return

For many reasons, it is important for universities to re-open in the autumn, but it is also clear that this is a high-risk endeavour: bringing around 2% of the UK population together in close proximity for 10 to 12 weeks and then re-dispersing them at Christmas.

When I first estimated the actual size of the impact I was, to be honest, shocked; it was a turning point for me. With an academic hat on I can play with the numbers as an intellectual exercise, but we are talking about many, many thousands of lives at risk, the vast majority outside the university itself, with the communities around universities most at risk.

I have tried to think of easy, gentle and diplomatic ways of expressing this, but there are none; we seem in danger of creating killing zones around our places of learning.

At the very best, outbreaks will be detected early, and instead of massive deaths we will see substantial lockdowns in many university cities across the UK with the corresponding social and economic costs, which will create schisms between ‘town and gown’ that may poison civic relationships for years to come.

In the early months of the year many of us in the university sector watched with horror as the Covid-19 numbers rose and we could see where this would end. The eventual first ‘wave’ and its devastating death toll did not need sophisticated modelling to predict; in the intervening months it has played out precisely as expected. At that point the political will was clearly set and time was short; there was little we could do but shake our heads in despair and feel the pain of seeing our predictions become reality as the numbers grew, each number a person, each person a community.

Across the sector, many are worried about the implications of the return of students and staff in the autumn, but structurally the nature of the HE sector in the UK makes it near impossible even for individual universities to take sufficient steps to mitigate it, let alone individual academics.

Doing the sums

For some time, universities across the UK have been preparing for the re-opening, working out ways to reduce the risk. There has been a mathematical modelling working group trying to assess the impact of various measures, as well as much activity at individual institutions.  It appears too that SAGE has highlighted that universities pose a potential risk [SN], but this seems to have gone cold and universities are coping as best they can with apparently no national plan. Universities UK have issued guidance to universities on what to do as they emerge from lockdown [UUKa], but it does not include an estimate of the scale of the problem.

As I said, the turning point for me came when I realised just how bad this could be. As with the early national growth pattern, it does not require complex mathematics to assess, within rough ranges, the potential impact; and even the most conservative estimates are terrifying.

We know from freshers’ flu that infections spread quickly amongst the student community.  The social life is precisely why many students relocate to distant cities.  Without strong measures to control student infections it is clear that Covid-19 will spread rapidly on campuses, leading to thousands of cases in each university. Students themselves are at low (though not zero) risk of dying or having serious complications from Covid-19, but if there is even small ‘leakage’ into the surrounding community (via university staff, transport systems, stay-at-home students or night life), then the impact is catastrophic.

For a mid-sized university of 20,000 students, let’s say only 1 in 20 become infected during the term; that is around 1,000 student cases. As a very conservative estimate, let’s assume just one community infection for every 10 infected students. If city bars are open this figure will almost certainly be much higher, but we’ll take a very low estimate. In this case, we are looking at 100 initial community cases.

Now 100 additional cases is already potentially enough to cause a handful of deaths, but we have got used to trading off social benefits against health costs; for any activity there is always a level of risk that we are prepared to accept.

However, the one bit of mathematics you do need to know is the way that a relatively small R number still leads to a substantial number of cases. For example, an R of 0.9 means for every initial infection the total number of infections is actually 10 times higher (in general 1/(1-R), see [Dx1]).  When R is greater than 1 the effect is worse still, with the impact only limited when some additional societal measure kicks in, such as a vaccine or local lockdown.

A relatively conservative estimate for R in the autumn is 1.5 [AMS]. For R =1.5, those initial 100 community cases magnify to over 10,000 within 5 weeks and more than 600,000 within 10 weeks. Even with the most optimistic winter rate of 1.2, those 100 initial community infections will give rise to 20,000 cases by the end of a term.
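
Here is a minimal Python sketch of this back-of-envelope calculation (illustrative only: I have assumed roughly two 3–4 day infection cycles per week, counting the seed’s own cycle as the first, which gives totals in line with the figures quoted above):

    def knock_on_total(r, cycles):
        """Total further cases per seed over a number of infection cycles:
        R + R^2 + ... + R^cycles."""
        return sum(r ** i for i in range(1, cycles + 1))

    students = 20_000
    student_cases = students * 0.05          # 5% of students infected
    community_seeds = student_cases * 0.1    # 1 in 10 leak into the community

    for r in (1.2, 1.5):
        for weeks in (5, 10):
            cycles = 2 * weeks - 1           # roughly two cycles a week (assumption)
            print(r, weeks, round(community_seeds * knock_on_total(r, cycles)))

Swap in your own estimates for the student infection rate and the leakage rate to see how quickly the totals move.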

That is for a single university.

With a mortality rate of 1% and the most optimistic figures, this means that each university will cause hundreds of deaths.  In other words, the universities in the UK will collectively create as many infections as the entire first wave.  At even slightly less optimistic figures, the impact is even more devastating.

Why return at all?

Given the potential dangers, why are universities returning at all in the autumn instead of continuing with fully online provision?

In many areas of life there is a trade-off to be made between, on the one hand, the immediate Covid-19 health impacts and, on the other, a variety of issues: social, educational, economic, and also longer term and indirect mental and physical health implications. This is no less true when we consider the re-opening of universities.

Social implications: We know that the lockdown has caused a significant increase in mental health problems amongst young people, for a variety of reasons: the social isolation itself, pressures on families, general anxiety about the disease, and of course worries about future education and jobs. Some of the arguments are similar to those for schools except that universities do not provide a ‘child minding’ role. Crucially, for both schools and universities, we know that online education is least effective for those who are already most economically deprived, not least because of continued poor access to digital technology. We risk creating a missed generation and deepening existing fractures in civil society.

Furthermore, the critical role of university research has been evident during the Covid crisis, from the development of new treatments to practical use of infrastructure for rapid production of PPE. Ongoing, the initial wave has emphasised the need for more medical training.  Of course, both education and research will also be critical for ‘post-Covid’ recovery.

Economic situation: Across the UK, universities generate £95 billion in gross output and support nearly a million jobs (2014–2015 data, [UUKb]).  Looking at Wales in particular, the HE sector “employs 17,300 full-time members of staff and spending by students and visitors supports an estimated 50,000 jobs across Wales”. At the same time the sector is particularly vulnerable to the effects of Covid-19 [HoC]. Universities across the UK were already financially straitened due to a combination of demographics and Brexit, leading to significant cost-cutting including job cuts [BBCa].  Covid-19 has intensified this; a Wales Fiscal Analysis briefing paper in May [WFA] suggests that Welsh universities may see a shortfall due to Covid-19 of between £100m and £140m. More recent estimates suggest that this may be understating the problem, if anything. Cardiff University alone is warning of a £168m fall in income [WO] and Sir Deian Hopkin, former Vice Chancellor of London South Bank and advisor to the Welsh Assembly, talks of a “perfect storm” in the university system [BBCb].

Government support has been minimal. The rules for Covid-19 furlough meant that universities were only able to take minimal advantage of the scheme. There has been some support in terms of general advice, reducing bureaucratic overheads and rescheduling payments to help university cashflow, but this has largely been within existing budgets, not new funding. The Welsh government has announced an FE/HE £50m support package with £27m targeting universities [WG], but this is small compared with predicted losses.

Universities across the UK have already cut casual teaching (the increase in zero-hour contracts has been a concern in HE for some years) and many have introduced voluntary severance schemes.  At the same time the competition over UK students has intensified in a bid to make up for reduced international numbers. Yet one of the principal ways to attract students is to maximise the amount of in-person teaching.

What is being done

To some extent, as in so many areas, coronavirus has exposed the structural weaknesses that have been developing in the university sector for the past 30 years. Universities have been forced to compete constantly and are measured in terms of student experience above educational impact. Society as a whole has been bombarded with messages that focus on individual success and safety rather than communal goals, and most current students have grown up in this context. This focus has been very evident in the majority of Covid-19 information and reporting [Dx2].

Everything we do is set against this backdrop, which both fundamentally limits what universities are able to do individually, and at the same time makes them responsible.  This is not to say that universities are not sharing good practice, both in top down efforts such as through Universities UK and direct contacts between senior management, and from the bottom up via person-to-person contacts and through subject-specific organisations such as CPHC.

Typically, universities are planning to retain some level of in-person teaching for small tutorials while completely or largely moving large-class activities such as lectures to online delivery, some live, some recorded. This will help to remove some student–student contact during teaching. Furthermore, many universities have discussed ways in which students could be formed into bubbles. At a large scale that could involve having rooms or buildings dedicated to a particular subject/year group for a day.  At a finer scale it has been suggested that students could be grouped into social/study bubbles of around ten or a dozen who are housed together in student accommodation and are also grouped for study purposes.

My own modelling of student bubbles [Dx3] suggests that while reducing the level of transmission, the impact is rapidly eroded if the bubbles are at all porous.  For example, if the small bubbles break and transmission hits whole year groups (80–200 students), the impact on outside communities becomes unacceptable. For students on campus the temptation to break these bubbles will be intense, both at an individual level and through bars and similar venues.  For those living at home, the complexities are even greater, and crucially they are a primary vector into the local community.

Combined with, or instead of, social/study bubbles some universities are looking at track and trace. Some are developing their own solutions both in terms of apps and regular testing programmes, but more will use normal health systems.  In Wales, for example, Public Health Wales regard university staff as a priority group for Covid-19 testing, although this is reactive (symptoms-based) rather than proactive (regular testing).

Dr Hans Kluge, the Europe regional director for the World Health Organization, and others have warned that surges across the world, including in Europe, are being driven by infections amongst younger people [BBCc].  He highlights the need to engage young people more in the science, a call that is reflected in a recent survey by the British Science Association which found that nine out of ten young people felt ignored by scientists and politicians [BSA].

As of 27th July, the UK Department for Education were “working to” two scenarios: “Effective containment and testing” (reduce growth on campuses and reactive local lockdowns) and “On and off restrictions” (delaying all in-person teaching until January) [DfE].  Jim Dickinson has collated and analysed current advice and work at various government and advisory bodies, including the DfE report above and SAGE, but so far there seems to be no public quantification of the risk [JD].

What can we do?

I think it is fair to say that the vast majority of high-level advice from national governments and pan-University bodies, and most individual university thinking, has been driven by safety concerns for students and staff rather than the potentially far more serious implications for society at large.

As with so many aspects of this crisis, the first step is to recognise there is a problem.

Within universities – acknowledge that the risk level will be far higher than in society at large because the case load will be far higher. How much higher will depend on mitigating measures, but whereas general population levels by the start of term may be as low as 1 in 5,000, the rate amongst students will be an order of magnitude higher, comparable with general levels during the peak of the ‘first wave’. This means that advice that is targeted at national levels, particularly for at risk groups, needs to be re-thought within the university context.  Individual vulnerable students are already worried [BBCd]. Chinese and Asian students seem more aware of the personal dangers and it is noticeable that, both within the UK and in the US, the universities with the greatest number of international students are more risk averse. University staff (academics, cleaners, security) will include more at risk individuals than the student body. It is hard to quantify, but the risk level will be considerably higher than, say, in a restaurant or pub, though of course lower than for front line medical staff. Even if it is ‘safe’ for vulnerable groups to come out of shielding in general society, it may not be safe in the context of the university. This will be difficult to manage: even if the university does not force vulnerable staff to return, the long-term culture of vocational commitment may make some people take unacceptable risks.

Outside the universities, local councils, national governments and communities need to be aware of the increased risks when the universities reopen, just as seaside towns have braced themselves for tourist surges post-lockdown. While SAGE has noted that universities may be an ‘amplifier’, the extent does not appear (at least publicly) to have been quantified.  In Aberdeen recently a cluster around a small number of pubs has caused the whole city to return to lockdown, and it is hard to imagine that we won’t see similar incidents around universities. This may lead to hard decisions, as has been discussed, between opening schools or pubs [BBCe] – city centre bars may well need to be re-thought. Universities benefit communities substantially both economically and educationally. For individual universities alone the costs of, say, weekly testing of students and staff would be prohibitive, but when seen in terms of regional or national health protection these may well be worthwhile. Although this is a ‘for example’ it could well be critical given the likelihood of large numbers of asymptomatic student cases.

Educate students – this is of course what we do as universities!  Covid-19 will be a live topic for every student, but they may well have many of the misconceptions that permeate popular discourse.  Can we help them become more aware of the aspects that connect to their own disciplines and hence to become ambassadors of good practice amongst their peers? Within maths and computing we can look at models and data analysis, which could be used in other scientific areas where these are taught.  Medicine is obvious and design and engineering students might have examples around PPE or ventilators. In architecture we can think about flows within buildings, ventilation, and design for hygiene (e.g. places to wash your hands in public spaces that aren’t inside a toilet!). In literature, there is pandemic fiction from Journal of the Plague Year to La Peste, and in economics we have examples of externalities (and if you leave externalities until a specialised final year option, rethink a 21st century economics syllabus!).

Time to act

On March 16, I posted on Facebook, “One week left to save the UK – and WE CAN DO IT.” Fortunately, we have more time now to ensure a safe university year but we need to act immediately to use that time effectively. We can do it.

References

[AMS] The Academy of Medical Sciences. Preparing for a challenging winter 2020-21. 14th July 2020. https://acmedsci.ac.uk/policy/policy-projects/coronavirus-preparing-for-challenges-this-winter

[BBCa] Cardiff University to cut 380 posts after £20m deficit. BBC News. 12th Feb 2019.  https://www.bbc.co.uk/news/uk-wales-47205659

[BBCb] Coronavirus: Universities’ ‘perfect storm’ threatens future.  Tomos Lewis  BBC News. 7 August 2020.  https://www.bbc.co.uk/news/uk-wales-53682774

[BBCc] WHO warns of rising cases among young in Europe. Lauren Turner, BBC News live reporting, 10:05am 29th July 2020. https://www.bbc.co.uk/news/live/world-53577222?pinned_post_locator=urn:asset:59cae0e7-5d3d-4e35-94ec-1895273ed016

[BBCd] Coronavirus: University life may ‘pose further risk’ to young shielders. Bethany Dawson. BBC News. 6th August 2020. https://www.bbc.co.uk/news/disability-53552077

[BBCe]  Coronavirus: Pubs ‘may need to shut’ to allow schools to reopen. BBC News. 1st August 2020.  https://www.bbc.co.uk/news/uk-53621613

[BG]  Colleges reverse course on reopening as pandemic continues.  Deirdre Fernandes, Boston Globe, updated 2nd August 2020.  https://www.bostonglobe.com/2020/08/02/metro/pandemic-continues-some-colleges-reverse-course-reopening/

[BSA] New survey results: Almost 9 in 10 young people feel scientists and politicians are leaving them out of the COVID-19 conversation. British Science Association. (undated) accessed 7/8/2020.  https://www.britishscienceassociation.org/news/new-survey-results-almost-9-in-10-young-people-feel-scientists-and-politicians-are-leaving-them-out-of-the-covid-19-conversation

[DfE] DfE: Introduction to higher education settings in England, 1 July 2020 Paper by the Department for Education (DfE) for the Scientific Advisory Group for Emergencies (SAGE). Original published 24th July 2020 (updated 27th July 2020).  https://www.gov.uk/government/publications/dfe-introduction-to-higher-education-settings-in-england-1-july-2020

[Dx1] More than R – how we underestimate the impact of Covid-19 infection. A. Dix (blog). 2nd August 2020. https://alandix.com/blog/2020/08/02/more-than-r-how-we-underestimate-the-impact-of-covid-19-infection/

[Dx2] Why pandemics and climate change are hard to understand, and can we help? A. Dix. North Lab Talks, 22nd April 2020 and Why It Matters, 30 April 2020 http://alandix.com/academic/talks/Covid-April-2020/

[Dx3] Covid-19 – Impact of a small number of large bubbles on University return. Working Paper. A. Dix. July 2020.  http://alandix.com/academic/papers/Covid-bubbles-July-2020/

[HEFCW] COVID-19 impact on higher education providers: funding, regulation and reporting implications.  HEFCW Circular, 4th May 2020 https://www.hefcw.ac.uk/documents/publications/circulars/circulars_2020/W20%2011HE%20COVID-19%20impact%20on%20higher%20education%20providers.pdf

[HoC]  The Welsh economy and Covid-19: Interim Report. House of Commons Welsh Affairs Committee. 16th July 2020. https://committees.parliament.uk/publications/1972/documents/19146/default/

[JD]  Universities get some SAGE advice on reopening campuses. Jim Dickinson, WonkHE, 25th July 2020.  https://wonkhe.com/blogs/universities-get-some-sage-advice-on-reopening-campuses/

[SN]  Coronavirus: University students could be ‘amplifiers’ for spreading COVID-19 around UK – SAGE. Alix Culbertson. Sky News. 24th July 2020. https://news.sky.com/story/coronavirus-university-students-could-be-amplifiers-for-spreading-covid-19-around-uk-sage-12035744

[UUKa] Principles and considerations: emerging from lockdown.   Universities UK, June 2020. https://www.universitiesuk.ac.uk/policy-and-analysis/reports/Pages/principles-considerations-emerging-lockdown-uk-universities-june-2020.aspx

[UUKb] https://www.universitiesuk.ac.uk/policy-and-analysis/reports/Pages/economic-impact-universities-2014-15.aspx

[WFA] Covid-19 and the Higher Education Sector in Wales (Briefing Paper). Cian Siôn, Wales Fiscal Analysis, Cardiff University.  14th May 2020.  https://www.cardiff.ac.uk/__data/assets/pdf_file/0010/2394361/Covid_FINAL.pdf

[WG]  Over £50 million to support Welsh universities, colleges and students.    Welsh Government press release.  22nd July 2020.  https://gov.wales/over-50-million-support-welsh-universities-colleges-and-students

[WO] Cardiff University warns of possible job cuts as it faces £168m fall in income. Abbie Wightwick, Wales Online. 10th June 2020.  https://www.walesonline.co.uk/news/education/cardiff-university-job-losses-coronavirus-18393947

 

More than R – how we underestimate the impact of Covid-19 infection

We have got so used to seeing R numbers quoted. However, taking this at its immediate value means we underestimate the impact of our individual and corporate actions.

Even with a lockdown R value of 0.9 when the disease is ‘under control’, a house party that leads to 10 initial infections will ultimately give rise to a further 90 cases, so is actually likely to lead to an additional Covid-19 death, probably totally unrelated to anyone at the original house party.

This multiplication factor is far bigger than the apparent 0.9 figure suggests and is at first counter-intuitive. This difference between the apparent figure and the real figure can easily lead to complacency.

If you have been following the explanations in the media you’ll know that R is the average number of further people to whom an infected person passes the disease. If R is greater than one, the disease increases exponentially – an epidemic – if R is less than one the disease gradually decays. In most countries the R-value before lockdown was between 2 and 3 (out of control), and during lockdown in the UK it reduced to a figure between 0.7 and 0.9 (slow decay).

However, note that this R value is about the average number of people directly infected by a carrier.

First it is an average – in reality most people infect fewer than the R number, but a few people infect a lot more, especially if the person has a large social network and is asymptomatic or slow to develop symptoms. This is why some news articles are picking up discussions of the ‘k’ factor¹, a measure of the extent to which there is variability.

Secondly, this is about direct infections. But of course if you infect someone, they may infect another person, and so on. So if you infect 3 people, and they each infect 3 more, that is 9 second order contacts.

Thirdly, the timescale of this infection cycle is 3–4 days, about half a week. This means that an R of 3 leads to approximately 9 times as many cases a week later, or doubling about every 2½ days, just what we saw in the early days of Covid-19 in the UK.
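
The doubling-time figure follows from the cycle length by a standard conversion: with a 3–4 day cycle and R=3,

$$ \text{doubling time} \;=\; \text{cycle length} \times \frac{\ln 2}{\ln R} \;\approx\; (3.5\text{–}4\ \text{days}) \times 0.63 \;\approx\; 2.2\text{–}2.5\ \text{days} . $$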

Let’s look at the effect of these indirect infections for an R below 1, when the disease is under control.

As an extreme first example, let’s take R=0.5, which is far smaller than almost anywhere has achieved even under lockdown. Let’s start off with 64 cases (chosen to make the numbers add up easily!). These 64 infect 32 others, these infect 16 more, each time halving. The diagram shows this happening with two cycles of infection each week, and the cases peter out after about 4 weeks. However, in that time a further 63 people have been infected.

If we do the same exercise with R = 0.9 and start off with 100 cases, we get 90 people infected from these initial 100, then a further 81 second order infections, 72 after the next cycle, and then in the following cycles (rounding down each time) 64, 57, 51, 45, 40, 36, 32, 28, 25, 22, 19, 17, 15, 13, 11, 9, 8, 7, 6, 5, 4, 3, 2, 1. That is, after 15 weeks we have a further 763 cases. On average (rather than rounding down), it is a little higher, 900 additional cases.
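
Here is a tiny Python version of that cascade (a sketch of the worked example above, rounding down at each 3–4 day cycle exactly as in the text):

    def cascade(seed_cases, r):
        """Count the further cases generated by a group of seed cases."""
        total, current = 0, seed_cases
        while current > 0:
            current = int(current * r)   # round down, as in the example above
            total += current
        return total

    print(cascade(100, 0.9))         # 763 further cases, as above
    print(100 * 0.9 / (1 - 0.9))     # 900 on average, using R/(1-R)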

In general the number of additional cases for each seed infection is R/(1-R): 9 for R=0.9; 2.3 for R=0.7.  This is very basic and well-known arithmetic series summation, but the large sizes can still be surprising even when one knows the underlying maths well.
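
Written out, this is just the sum of the geometric series of direct and indirect infections:

$$ \text{additional cases per seed} \;=\; R + R^2 + R^3 + \cdots \;=\; \frac{R}{1-R} \qquad (R < 1) . $$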

Things get worse once R becomes greater than 1. If R is exactly 1 there is on average 1 new case for each infected person.  So if there is one ‘seed’ case, then each succeeding week will see two new cases, for ever. In reality there will not be an infinite number of cases, as eventually there will be a vaccine, further lockdown, or something else to clamp down on new cases, but there is no natural limit at which the new cases peter out.

Mid-range estimates in the UK suggest that during the winter we may see an R of 1.5². This is assuming that social distancing measures and effective track-and-trace are in place, but where winter weather means that people are indoors more often and transmission is harder to control. The lower bound figure being used is 1.2.

If we look over just a 5-week window, with R=1.2 each seed case leads to nearly 25 additional cases during the period; with R=1.5 this rises to over 100 new cases.  Over a 10-week period (a university term), these figures are around two hundred new cases with R=1.2 or six and a half thousand for R=1.5.
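
For R above 1 the series no longer converges, so the length of the window matters; over n infection cycles the total per seed is the finite sum

$$ R + R^2 + \cdots + R^n \;=\; \frac{R\,(R^n - 1)}{R - 1} . $$

For example, taking R=1.5 with n=19 cycles (roughly two 3–4 day cycles a week over a ten-week term – an assumption on my part) gives about 6,600, in line with the figure above.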

So next time you see R=0.7 think two and a half, when you see R=0.9 think ten, and when you see R=1.5 think thousands.

The last of these is crucial: taking into account a mortality rate of around 1%, each avoided infection this coming winter will save around ten lives.

 

  1. For example, BBC News: Coronavirus: What is the k number and can superspreading be stopped? Rebecca Morelle, 6 June 2020
  2. The Academy of Medical Sciences. Preparing for a challenging winter 2020-21. 14th July 2020

Free AI book and a new one coming …

Yes a new AI book is coming … but until then you can download the first edition for FREE 🙂

Many years ago Janet Finlay and I wrote a small introduction to artificial intelligence.  At the time there were several Bible-sized tomes … some of which are still the standard textbooks today.  However, Janet was teaching a masters conversion course and found that none of these books were suitable for taking the first steps on an AI journey, especially for those coming from non-computing disciplines.

Over the years it faded to the back of our memories, with the brief exception of the time when, after we’d nearly forgotten it, CRC Press issued a Japanese translation.  Once or twice the thought of doing an update arose, but quickly passed.  This was partly because our main foci were elsewhere, but also, at the risk of insulting all my core-AI friends, because not much changed in core AI for many years!

Coming soon … Second Edition

Of course over recent years things have changed dramatically, hence my decision, nearly 25 years on, to create a new edition maintaining the aim to give a rich but accessible introduction, but capturing some of the recent trends and giving these a practical and human edge.  Following the T-model of teaching, I’d like to help both newcomer and expert gain a broad perspective of the issues and landscape, whilst giving enough detail for those that want to delve into a more specific area.

A Free Book and New Resources

In the meantime the publisher, Taylor & Francis/CRC, has agreed to make the PDF of the first edition available free of charge.  I have updated some of the code examples from the first edition and will be incrementally adding new material to the second edition micro-site, including slides, case studies, video and interactive materials.  If you’d like to teach using this, please let me know your views on the topics and also whether there are areas where you’d like me to create preliminary material with greater urgency.  I won’t promise to be able to satisfy everyone, but I can use this to adjust my priorities.

Why now?

The first phase of change in AI was driven by the rise of big data and the increasing use of forms of machine learning to drive adverts, search results and social media.  Within user interface design, many of the fine details of colour choices and screen layout are now determined using A/B testing … slight variants of interfaces delivered to millions of people – shallow, without understanding, and arguably little more than bean counting, but in numerous areas vast data volume has been found to be ‘unreasonably effective’ at solving problems that were previously seen to be the remit of deep AI.

In the last few years deep learning has taken over as the driver of AI research and often also of media hype.  Here it has been the sheer power of computation, partly due to Moore’s Law, with computation nearly a million times faster than it was when that first edition was written nearly 25 years ago.  However, it has also been enabled by cloud computing allowing large numbers of computers to efficiently attack a single problem.  Algorithms that might have been conceived of but dismissed as impractical in the past have become commonplace.

Alongside this has been a dark side of AI, from automated weapons and mass surveillance, to election rigging and the insidious knowledge that large corporations have gathered through our day-to-day web interactions.  In the early 1990s I warned of the potential danger of ethnic and gender bias in black-box machine learning and I’ve returned to this issue more recently as those early predictions have come to pass.

Across the world there are new courses running or being planned and people who want to know more.  In Swansea we have a PhD programme on people-first AI/big data, and there is currently a SIGCHI Italy workshop call out for Teaching HCI for AI: Co-design of a Syllabus. There are several substantial textbooks that offer copious technical detail, but can be inaccessible for the newcomer or those coming from other disciplines.  There are also a number of excellent books that deal with the social and human impact of AI, but without talking about how it works.

I hope to be able to build upon the foundations that Janet and I established all those years ago to create something that fills a crucial gap: giving a human-edge to those learning artificial intelligence from a computing background and offering an accessible technical introduction for those approaching the topic from other disciplines.

 

 

Software for 2050

New Year’s resolutions are for a year ahead, but with the start of a new decade it is worth looking a bit further.
How many of the software systems we use today will be around in 2050 — or even 2030?

Story 1.  This morning the BBC reported that NHS staff need up to 15 different logins to manage ‘outdated’ IT systems and I have seen exactly this in a video produced by a local hospital consultant. Another major health organisation I talked to mentioned that their key systems are written in FoxBase Pro, which has not been supported by Microsoft for 10 years.

Story 2.  Nearly all worldwide ATM transactions are routed through systems that include COBOL code (‘natural language’ programming of the 1960s) … happily IBM still do support CICS, but there is concern that COBOL expertise is literally dying out.

Story 3.  Good millennial tech typically involves an assemblage of cloud-based services: why try to deal with images when you have Flickr … except Flickr is struggling to survive financially; why have your own version control system when you can use Google Code, except Google Code shut down in 2016 after 10 years.

Story 3a.  Google have a particularly bad history of starting or buying services and then dropping them: Freebase (sigh), Revolv Hub home automation, too many to list. They are doing their best with AngularJS, which has a massive uptake in hi-tech, and is being put into long-term maintenance mode — however, ‘long-term’ here will not mean COBOL long-term, just a few years of critical security updates.

Story 4.  Success at last. Berners-Lee did NOT build the web on cutting edge technology (an edge of sadness here as hypertext research, including external linkage, pretty much died in 1994), and because of this it has survived and probably will still be functioning in 2050.

Story 5.  I’m working with David Frohlich and others who have been developing slow, meaningful social media for the elderly and their families. This could potentially contribute to very long term domestic memories, which may help as people suffer dementia and families grieve after death. However, alongside the design issues for such long-term interaction, what technical infrastructure will survive a current person’s lifetime?

You can see the challenge here.  Start-ups are about creating something that will grow rapidly in 2–5 years, but then be sold, thrown away or re-engineered from scratch.  Government and health systems need to run for 30 years or more … as do our personal lives.

What practical advice do we give to people designing now for systems that are likely to still be in use in 2050?

On the edge of chaos

Running in the early morning, the dawn sun drives a burnt orange road across the bay. The water’s margin is often the best place to tread, the sand damp and solid, sound underfoot, but unpredictable. The tide was high and at first I thought it had just turned, the damp line a full five yards beyond the edge of the current waves. Some waves pushed higher and I had to swerve and dance to avoid the frothing edge, others lower, wave following wave, but in longer cycles, some higher, some lower.

It was only later I realised the tide was still moving in: the damp line I had seen as the zenith of high tide had merely been the high point of a cycle, and I had run out during a temporary low.  Cycles within cycles, the larger cycles predictable and periodic, driven by moon and sun, but the smaller ones, the waves and patterns of waves, driven by wind and distant storms thousands of miles away.

I’m reading Kate Raworth’s Doughnut Economics.  She describes the way 20th-century economists (and many still) were wedded to simple linear models of closed processes, and hence missed the crucial complexities of an interconnected world, making the (predictable) crashes far worse.

I was fortunate that even at school I recall watching the BBC documentary on chaos theory and then attending an outreach lecture at Cardiff University, targeted at children, where the speaker, an expert in Chaos and Catastrophe Theory, gave a more mathematical treatment.  Ideas of quasi-periodicity, non-linearity, feedback, phase change, tipping points and chaotic behaviour have been part of my understanding of the world since early in my education.

Nowadays ideas of complexity are more common; Hollywood has embraced the idea that the flutter of a butterfly’s wing could be the final straw that causes a hurricane.  This has been helped in no small part by the high profile of the Santa Fe Institute and numerous popular science books.  However, only recently I was with a number of academics in computing and mathematics who had not come across ‘criticality’ as a term.

Criticality is about the way many natural phenomena self-organise to be on the edge so that small events have a large impact. The classic example is a pile of sand: initially a whole bucketful tipped on the top will just stay there, but after a point the pile gets to a particular (critical) angle, where even a single grain may cause a minor avalanche.
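
To make the sand pile concrete, here is a toy sketch of my own (purely illustrative, not drawn from any particular published code) of the Bak–Tang–Wiesenfeld sandpile, the textbook model of self-organised criticality: grains are dropped one at a time onto a grid, and any cell holding four or more grains topples, passing a grain to each of its neighbours.

```typescript
// A minimal, illustrative sandpile simulation (grid size and threshold are
// assumed parameters).  Grains accumulate until cells reach a threshold and
// topple; once the pile has self-organised to its critical state, a single
// grain can trigger an avalanche of almost any size.

const SIZE = 20;       // grid width/height
const THRESHOLD = 4;   // a cell with this many grains topples

type Grid = number[][];

function makeGrid(): Grid {
  return Array.from({ length: SIZE }, () => new Array<number>(SIZE).fill(0));
}

// Drop one grain at (x, y), relax the pile, and return the avalanche size
// (the number of topplings).  Grains pushed off the edge are simply lost.
function dropGrain(grid: Grid, x: number, y: number): number {
  grid[y][x] += 1;
  let topplings = 0;
  let unstable = true;
  while (unstable) {
    unstable = false;
    for (let i = 0; i < SIZE; i++) {
      for (let j = 0; j < SIZE; j++) {
        if (grid[i][j] >= THRESHOLD) {
          grid[i][j] -= THRESHOLD;
          topplings++;
          unstable = true;
          if (i > 0) grid[i - 1][j]++;
          if (i < SIZE - 1) grid[i + 1][j]++;
          if (j > 0) grid[i][j - 1]++;
          if (j < SIZE - 1) grid[i][j + 1]++;
        }
      }
    }
  }
  return topplings;
}

// Drive the pile towards criticality, then watch the avalanche sizes.
const pile = makeGrid();
for (let n = 0; n < 20000; n++) {
  const x = Math.floor(Math.random() * SIZE);
  const y = Math.floor(Math.random() * SIZE);
  const size = dropGrain(pile, x, y);
  if (n >= 19990) console.log(`grain ${n}: avalanche of ${size} topplings`);
}
```

Run like this, most grains do nothing at all, but every so often a single grain triggers a cascade across much of the grid – exactly the ‘small events, large impact’ signature of criticality.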

If we understand the world in terms of stable phenomena, where small changes cause small effects, and things that go out of kilter are brought back by counter effects, it is impossible to make sense of the wild fluctuations of global economics, political swings to extremism, and cataclysmic climate change.

One of the things ignored by some of the most zealous proponents of complexity is that many of the phenomena that we directly observe day-to-day do in fact follow the easier laws of stability and small change. Civilisation develops in parts of the world that are relatively stable and then when we modify the world and design artefacts within it, we engineer things that are understandable and controllable, where simple rules work. There are times when we have to stare chaos in the face, but where possible it is usually best to avoid it.

[Image: waves – photo by lovefibre]

However, even this is changing. The complexity of economics is due to the large-scale networks within global markets with many feedback loops, some rapid, some delayed. In modern media and more recently the internet and social media, we have amplified this further, and many of the tools of big-data analysis, not least deep neural networks, gain their power precisely because they have stepped out of the world of simple cause and effect and embrace complex and often incomprehensible interconnectivity.

The mathematical and computational analyses of these phenomena are not for the faint hearted. However, the qualitative understanding of the implications of this complexity should be part of the common vocabulary of society, essential to make sense of climate, economics and technology.

In education we often teach the things we can simply describe, that are neat and tidy, explainable, where we don’t have to say “I don’t know”. Let’s make space for piles of sand alongside pendulums in physics, screaming speaker-microphone feedback in maths, and contingency alongside teleological inevitability in historic narrative.

Paying On Time – universities are failing

Universities are not living up to Government prompt-payment targets.  As many suppliers will be local SMEs, this threatens the cashflow of businesses that may be teetering on the edge, and the well-being of local economies.

I’ve twice in the last couple of months been hit by university finance systems that have a monthly payment run, so that if a claim or invoice is not submitted by a certain date, often the first day or two of the month, it is not paid until the end of the following month, leading to a seven-week delay in payment.  This is despite Government guidelines of a normal 30-day payment period, with an aim of 80% of payments within 5 working days.

I’d like to say these are rare cases, but sadly they are typical of university payment and expense systems.  In some cases this is because one is being treated as a casual employee, so falling into payroll systems.  However, often the same systems are clearly being used for commercial payments.  This means that if a supplier misses a monthly deadline they may wait nearly two months for payment … and of course if they are VAT registered they may have already had to pay the VAT portion to HMRC before they actually receive the payment.

The idea of monthly cheque runs is a relic of the 1970s when large reels of magnetic tapes had to be mounted on refrigerator-sized machines and special paper had to be loaded into line-printers for cheque runs.  In the 21st century when the vast proportion of payments are electronic, it is an embarrassing and unethical anachronism.

As well as these cliff-edge deadlines, I’ve seen university finance systems that bounce payments to external suppliers because the data was submitted on an out-of-date form, even when the form was provided in error by a member of university staff.

Even worse are university finance systems organised so that when there is a problem with a payment (for example, a temporary glitch in electronic bank payments), instead of retrying the payment, or informing the payee or the relevant university contact, the system simply ignores it, leaving it in limbo.  I’ve encountered missing payments of this kind up to a year after the original payment date.  If one were cynical, one might imagine that they simply hope the supplier will never notice.

Late payment became a major issue a few years ago.  Following the recession, many SMEs were constantly teetering on the edge of bankruptcy, yet larger firms were lax in paying promptly, knowing that they were in a position of power (e.g. see “Getting paid on time”, issued by the Department for Business, Innovation & Skills, February 2012).

Five years on this is still a problem.  In April last year The Independent estimated that British SMEs were owed £44.6 billion in late or overdue payments (see “The scourge of late payment”).  There is now mandatory reporting of payment practices for larger companies, and recent returns showed that some companies missed prompt payment up to 96% of the time, with bad performers including major names such as Deloitte (see “Ten of the UK’s big businesses that fail to pay suppliers on time get named and shamed by the Government”).

There is also a voluntary “Prompt Payment Code“, but, amongst the signatories, there are only two universities (Huddersfield and Westminster) and three colleges.

Universities are often proud of the way they support local economies and communities: being major employers and often offering advice to local businesses.  However, when it comes to prompt payment they are failing those same communities.

So, well done Huddersfield and Westminster, and for the rest of the university system – up your game.

physigrams – modelling the device unplugged

Physigrams get their own micro-site!

See it now at physicality.org/physigrams

Appropriate physical design can make the difference between an intuitively obvious device and one that is inscrutable.  Physigrams are a way of modelling and analysing the interactive physical characteristics of devices from TV remotes to electric kettles, filling the gap between foam prototypes and code.

Sketches or CAD allow you to model the static physical form of the device, and this can be realised in moulded blue foam, 3D printing or cardboard mock-ups.  Prototypes of the internal digital behaviour can be produced using tools such as Adobe Animate, proto.io or atomic, or hand-coded using standard web-design tools.  The digital behaviour can also be modelled using industry-standard techniques such as UML.

  

Physigrams allow you to model the ‘device unplugged’ – the pure physical interaction potential of the device: the ways you can interact with buttons, dials and knobs, and how you can open, slide or twist movable elements.  These physigrams can be attached to models of the digital behaviour to understand how well the physical and digital designs complement one another.
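
Physigrams themselves are a diagrammatic notation, but to give a rough flavour of the underlying idea in code, here is a small sketch of my own (hypothetical and purely illustrative – not part of the physigram notation or any DEPtH tooling): the ‘device unplugged’ is modelled as a purely physical state machine of a spring-loaded push button, which is then coupled to a separate digital model of the lamp it controls.

```typescript
// Toy illustration of the 'device unplugged' idea: the physical model below
// holds whether or not any electronics are connected – you can still press a
// springy button when the batteries are flat.

// Physical model: a spring-loaded push button.
type PhysicalState = 'up' | 'down';
type PhysicalAction = 'press' | 'release';

function physicalStep(state: PhysicalState, action: PhysicalAction): PhysicalState {
  if (state === 'up' && action === 'press') return 'down';
  if (state === 'down' && action === 'release') return 'up';  // spring-back
  return state;  // any other combination leaves the device as it is
}

// Digital model: a lamp that toggles on each completed press.
type DigitalState = 'off' | 'on';

function digitalStep(state: DigitalState): DigitalState {
  return state === 'off' ? 'on' : 'off';
}

// Coupling: the physical up -> down transition is the event the electronics see.
let button: PhysicalState = 'up';
let lamp: DigitalState = 'off';

for (const action of ['press', 'release', 'press', 'release'] as PhysicalAction[]) {
  const next = physicalStep(button, action);
  if (button === 'up' && next === 'down') lamp = digitalStep(lamp);
  button = next;
  console.log(`${action}: button ${button}, lamp ${lamp}`);
}
```

Keeping the two models separate is the point: the question is then how well the feel of the physical states and transitions complements the digital behaviour they are wired to.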

Physigrams were developed some years ago as part of the DEPtH project, a collaboration between product designers at Cardiff School of Art and Design and computer scientists at Lancaster University, and they have been described in various papers over the years.  However, with TouchIT, our book on physicality and design, (eventually!) reaching completion and due out next year, it felt that physigrams deserved a home of their own on the web.

The physigram micro-site, part of physicality.org includes descriptions of physical interaction properties, a complete key to the physigram notation, and many examples of physigrams in action from light switches, to complete control panels and novel devices.

Timing matters!

How long is an instant?  The answer, of course, is ‘it depends’, but I’ve been finding it fascinating playing with the demo page for AngularJS tooltips and seeing what feels like ‘instant’ for a tooltip.

The demo allows you to adjust the md-delay property so you can change the delay between hovering over a button and the tooltip appearing, and then instantly see what that feels like.

Try it yourself: set a time and then either move over the button as if you were about to click it or were wondering what it does, or simply pass over it as if you were moving your pointer to another part of the page.
 
If the delay is too short (e.g. 0), the tooltip flickers as you simply pass over the icon.
 
If you want it as a backup for when someone forgets the action, then something longer, about a second, is fine – the aim is to be there only if the user has that moment of doubt.
 
However, I was fascinated by how long the delay needed to be to feel ‘instant’ and yet not appear by accident.
 
For me, about 150 ms is not noticeable as a delay, whereas at 200 ms I start to notice it – not an annoying delay, but a very slight sense of a lack of responsiveness.
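
If you want to play with the same effect outside the demo, below is a minimal sketch in plain TypeScript and DOM events – my own illustration, not AngularJS Material’s actual implementation.  The delay parameter plays the role of md-delay: try 0, 150, 200 and 1000 ms to feel the difference between flicker, ‘instant’ and a one-second backup.

```typescript
// Minimal hover-delay tooltip logic (illustrative only).  The tooltip is shown
// only if the pointer is still over the target after 'delay' milliseconds, so a
// quick pass-over never triggers it; with delay = 0 it flickers on every pass.

function attachTooltip(target: HTMLElement, text: string, delay: number): void {
  let timer: number | undefined;
  let tip: HTMLElement | undefined;

  target.addEventListener('mouseenter', () => {
    timer = window.setTimeout(() => {
      tip = document.createElement('div');
      tip.className = 'tooltip';   // styling assumed elsewhere
      tip.textContent = text;
      document.body.appendChild(tip);
    }, delay);
  });

  target.addEventListener('mouseleave', () => {
    if (timer !== undefined) window.clearTimeout(timer);
    tip?.remove();
    tip = undefined;
  });
}

// Hypothetical usage: a button with id 'save' and a 150 ms 'instant' delay.
const saveButton = document.getElementById('save');
if (saveButton) attachTooltip(saveButton, 'Save your changes', 150);
```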