Hierarchical grammars for more human-like compiler parsing

Nearly twenty years ago, back when I was in York, one of my student project suggestions was to try to make compiler parsers operate a little more like a human: scanning first for high-level structures like brackets and blocks and only moving on to finer-level features later.  If I recall correctly there were several reasons for this, including connections with ‘dynamic pointers’1, but the most important was to help error reporting, especially in cases of mismatched brackets or a missing ‘;’ at line ends … still a big problem.

Looking back I can see that one MEng student considered it, but in the end didn’t do it, so it lay amongst that great pile of “things to do one day” and discuss occasionally over tea or beer. I recall too looking at grammar-to-grammar parsers … I guess nowadays I might imagine using XSLT!

Today, 18 years on, while scanning David Ungar’s publications I discover that he actually did this in the Java parser at Sun2.  I don’t know whether it is actually used in current Java implementations.  Their reasons for looking at the issue were to do with making the parser easier to maintain, so it may be that this is being done under the hood, but the benefits for the Java programmer are not being realised.

While I was originally thinking about programming languages, I have more recently found myself using the general method in anger when doing data cleaning, as one often approaches this in a pipeline fashion, creating elements of structure along the way that are picked up by later parsing/cleaning steps.
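
To give a feel for what I mean by the pipeline, here is a small, purely illustrative sketch in Python (the record format, field separators and date pattern are invented for the example, not taken from any real data set): each pass adds a little structure that the following pass then relies on.

```python
import re

# Illustrative only: staged cleaning, where each pass adds structure
# that later passes pick up, mirroring the multi-pass parsing idea.

def split_records(raw):
    """Pass 1: coarse structure, one dict per non-empty line."""
    return [{'raw': line} for line in raw.splitlines() if line.strip()]

def split_fields(records):
    """Pass 2: break each record into rough fields on ';' or ','."""
    for r in records:
        r['fields'] = [f.strip() for f in re.split(r'[;,]', r['raw'])]
    return records

def tag_dates(records):
    """Pass 3: pick out anything that looks like a dd/mm/yyyy date."""
    for r in records:
        r['dates'] = [f for f in r['fields']
                      if re.fullmatch(r'\d{1,2}/\d{1,2}/\d{4}', f)]
    return records

def clean(raw):
    data = split_records(raw)
    for stage in (split_fields, tag_dates):
        data = stage(data)
    return data
```

Each stage leaves the earlier structure intact, so a later pass can always fall back on the raw text when its own rules do not apply.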

To my knowledge there are no general purpose tools for doing this.  So, if anyone is looking for a little project, here is my own original project suggestion from 1993 …

Background
When compilers parse a computer program, they usually proceed in a sequential, left-to-right fashion. The computational requirement of limited lookahead means that the syntax of programming languages must usually be close to LL(1) or LR(1). Human readers use a very different strategy. They scan the text for significant features, building up an understanding of the text in a more top-down fashion. The human reader thus looks at the syntax at multiple levels, and we can think of this as a hierarchical grammar.

Objective
The purpose of this project is to build a parser based more closely on this human parsing strategy. The target language could be Pascal or C (Ada is probably a little complex!). The parser will operate in two or more passes. The first pass would identify the block structure; for example, in C this would be based on matching various brackets and delimiters ‘{};,()’. This would yield a partially sequential, partially tree-like structure. Mismatched brackets could be detected at this stage, avoiding the normally confusing error messages generated by this common error. Subsequent passes would ‘parse’ this tree, eventually obtaining a standard syntax tree.
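
By way of illustration, a minimal sketch of such a first pass might look like the following (Python, and purely an illustration here, nothing to do with the original proposal’s target implementation or with Ungar’s parser). It matches brackets only, keeps everything else as raw text for later passes, and reports mismatched or unclosed brackets together with the line on which the offending bracket was opened.

```python
OPEN = {'{': '}', '(': ')', '[': ']'}
CLOSE = {close: open_ for open_, close in OPEN.items()}

def first_pass(source):
    """Build a partial tree: a block for each bracket pair, raw text elsewhere.
    Mismatched or unclosed brackets are reported before any finer parsing."""
    root = []                     # top-level sequence of text and blocks
    stack = [(None, root, 0)]     # (opening bracket, children, line opened)
    line = 1
    for ch in source:
        if ch == '\n':
            line += 1
        if ch in OPEN:
            children = []
            stack[-1][1].append((ch, children, line))
            stack.append((ch, children, line))
        elif ch in CLOSE:
            opener, _, opened_at = stack[-1]
            if opener is None:
                raise SyntaxError(f"line {line}: unexpected '{ch}'")
            if ch != OPEN[opener]:
                raise SyntaxError(f"line {line}: '{ch}' closes '{opener}' "
                                  f"opened at line {opened_at}")
            stack.pop()
        else:
            stack[-1][1].append(ch)   # keep raw text for later passes
    if len(stack) > 1:
        opener, _, opened_at = stack[-1]
        raise SyntaxError(f"'{opener}' opened at line {opened_at} is never closed")
    return root
```

A later pass would then walk the children of each block, parsing the raw text inside against the statement-level grammar, so that the standard syntax tree only emerges at the end.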

Options
Depending on progress, the project can develop in various ways. One option is to use the more human-like parsing to improve error reporting; for example, the first pass could identify the likely sites where brackets have been missed by analysing the indentation structure of the program. Another option would be to build a YACC-like tool to assist in the production of multi-level parsers.
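
For the indentation idea, a rough heuristic sketch (again just an assumption about how this option might work, not a worked-out design) is to compare each line’s indentation against the brace nesting the text implies: a line indented more shallowly than its nesting suggests is a candidate site for a forgotten ‘}’.

```python
def suspect_missing_braces(source, indent_width=4):
    """Return line numbers whose indentation is shallower than the brace
    nesting implies: candidate sites for a forgotten '}'."""
    suspects = []
    depth = 0
    for lineno, text in enumerate(source.splitlines(), start=1):
        stripped = text.lstrip()
        if not stripped:
            continue                      # ignore blank lines
        before = depth
        depth += text.count('{') - text.count('}')
        expected = min(before, depth)     # assumes K&R-style brace layout
        indent = (len(text) - len(stripped)) // indent_width
        if indent < expected:
            suspects.append(lineno)
    return suspects
```

The earliest flagged line is usually the most informative, since a single missing brace makes everything after it look too shallow.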

Reading

1.  S. P. Robertson, E. F. Davis, K. Okabe and D. Fitz-Randolf, “Program comprehension beyond the line”3, pp. 959-963 in Proceedings of Interact’90, North-Holland, 1990.
2.  Recommended reading from compiler construction course
3.  YACC manual from UNIX manual set.

  1. For more on Dynamic Pointers see my first book “Formal Methods for Interactive Systems” and the CSCW journal paper “Dynamic pointers and threads”.
  2. D. M. Ungar, “Modular parser architecture with mini parsers”, US Patent 7,089,541, 2006.
  3. Incidentally, “Program comprehension beyond the line” is a fantastic paper both for its results and also methodologically. In the days when eye-tracking was still pretty complex (maybe still now!), they wanted to study program comprehension, so instead of following eye gaze, they forced experimental subjects to physically scroll through code using a single-line browser.

the real tragedy of the commons

I’ve just been reviewing a paper that mentions the “tragedy of the commons”1  and whenever I read or hear the phrase I feel the hackles on the back of my neck rise.

Of course the real tragedy of the commons was not free-riding and depletion by common use, but the rape of the land under mass eviction or enclosure movements when they ceased to be commons.  The real tragedy of “the tragedy of the commons” as a catch phrase is that it is often used to promote the very same practices of centralisation.  Where common land has survived today, just as in the time before enclosures and clearances, it is still managed in a collaborative way, both for the people now and for the sake of future generations.  Indeed on Tiree, where I live, there are large tracts of common grazing land managed in just such a way.

It is good to see that the Wikipedia article on the “Tragedy of the Commons” does give a rounded view of the topic, including reference to an historical and political critique by Ian Angus2.

The paper I was reading was not alone in uncritically using the phrase.  Indeed in “A Framework for Web Science”3 we read:

In a decentralised and growing Web, where there are no “owners” as such, can we be sure that decisions that make sense for an individual do not damage the interests of users as a whole? Such a situation, known as the ‘tragedy of the commons’, happens in many social systems that eschew property rights and centralised institutions once the number of users becomes too large to coordinate using peer pressure and moral principles.

In fact I do have some sympathy with this, as the web involves a vast number of physically dispersed users who are perhaps “too large to coordinate using peer pressure and moral principles”.  However, what is strange is that the web has given rise to so many modern counter-examples to the tragedy of the commons, not least Wikipedia itself.  In many open source projects people effectively work in a form of gift economy where, if there is any reward, it is in the form of community or individual respect.

Clearly, there are examples in the world today where many individual decisions (often for short-term gain) lead to larger-scale collective loss.  This is most clearly evident in the environment, but also in the recent banking crisis, which was fuelled by the desire for large mortgages and debt-led lives in general.  However, these are exactly the opposite of the values surrounding traditional common goods.

It may be that the problem is not so much that large numbers of people dilute social and moral pressure, but that the impact of our actions becomes too diffuse for us to appreciate when we make our individual life choices.  The counter-culture of many parts of the web may reflect, in part, the way in which aspects of the web can make the impact of small individual actions clearer to the individual and more accountable to others.

  1. Garrett Hardin, “The Tragedy of the Commons”, Science, Vol. 162, No. 3859 (December 13, 1968), pp. 1243-1248. … and here is the danger of citation counting as a quality metric: I am citing it because I disagree with it!
  2. Ian Angus, “The Myth of the Tragedy of the Commons”, Socialist Voice, August 24, 2008.
  3. Berners-Lee, T., Hall, W., Hendler, J. A., O’Hara, K., Shadbolt, N. and Weitzner, D. J. (2006) A Framework for Web Science. Foundations and Trends in Web Science, 1 (1), pp. 1-130. http://eprints.ecs.soton.ac.uk/13347/