randomness in computing
Traditional algorithms are deterministic, attempting to find the unique or best solution. In contrast, 'modern' algorithmics (modern here really goes back at least 30 years!), including neural networks, genetic algorithms and simulated annealing, makes heavy use of randomness. These algorithms are non-deterministic and find 'a solution' rather than 'the solution', and 'good' rather than 'best'. Because of this more relaxed and inexact approach, they can tackle problems that are otherwise intractable, including NP-hard ones. Quality is traded for computation: in some cases this is a simple cost-benefit trade-off; in others it is because the computation for the exact solution would be impossible in practice.
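To give the flavour, the tiny sketch below (an illustrative toy, not drawn from any of the systems discussed here) estimates pi by Monte Carlo sampling: more random samples buy a better, though never exact, answer - computation traded directly for quality.

    import random

    def estimate_pi(samples):
        """Monte Carlo estimate of pi: the fraction of random points
        in the unit square falling inside the quarter circle is pi/4."""
        inside = 0
        for _ in range(samples):
            x, y = random.random(), random.random()
            if x * x + y * y <= 1.0:
                inside += 1
        return 4.0 * inside / samples

    # More samples give a better (but never exact) estimate.
    for n in (100, 10_000, 1_000_000):
        print(n, estimate_pi(n))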
Examples of these algorithms include:
- optimisation/search - simulated annealing, genetic algorithms and neural networks all involve some degree of randomness, which enables approximate solutions to NP-hard problems. This works because there are many reasonably good solutions, albeit in a complex space, so finding a pretty good solution is acceptable. Furthermore, in some of these areas the end result is machine learning: finding a set of rules based on training data to apply to future unseen examples. In these cases the 'optimal' solution for the training data would actually be a poor solution, as it would be over-trained. Sub-optimal but pretty good solutions may well be better than optimal ones! (See the simulated annealing sketch after this list.)
- cryptography - keys need to be random so that they cannot be guessed by a third party. Often the keys must be prime numbers, or some other large prime has to be chosen as part of the cryptographic algorithm. These primes may be hundreds of bits long, so large that known deterministic algorithms would take times of the order of the age of the universe! The solution is to randomly generate numbers and test them to see if they are prime. However, exact primality tests would themselves take far too long, so probabilistic tests are used, which rely on yet more random numbers being used as probes [Schneier 1996] (see the Miller-Rabin sketch after this list). Digital watermarking also makes use of spread-spectrum techniques (see below) [Dugelay 2001].
- signal processing - adding white noise before quantisation can, under certain circumstances, increase the effective sensitivity of analogue-to-digital circuitry, a technique known as dithering! (See the dither sketch after this list.)
- wireless communication - spread-spectrum encoding techniques make use of pseudo-random shifts between frequencies. In particular, Code Division Multiple Access (CDMA) modulates a signal by multiplying it against a pseudo-random signal - this allows the same frequency band to be shared by multiple transmitters and also protects against certain types of noise, jamming and eavesdropping [CDMA 2000]. (See the spreading sketch after this list.)
- telephone routing - some years ago the US phone network suffered a massive, albeit temporary, collapse. The telephone routing software sought lowest-cost routes. When a large central exchange broke down, calls that would normally have been routed through it were diverted to the next lowest-cost route. Because the routing was deterministic, large numbers of calls all hit the same exchange on that second route, which was unable to cope with the load and itself broke down. The same thing happened on the next lowest-cost route, and so on, until the whole system collapsed. The routing algorithms have since been modified to allow some sub-optimal routing, randomly spreading calls over 'cheap enough' paths, which should avoid a similar catastrophic collapse in future. (A sketch of this kind of randomised routing follows the list.)
- parallel computation - by randomising the placement of computations on processors, it is possible to avoid computations that would hit the worst-case behaviour of parallel algorithms [Raman 1998]. Similarly, random routing between multiple processors and memory elements is used to avoid contention problems at routers, rather like the telephone case.
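The simulated annealing mentioned in the optimisation/search item can be sketched in a few lines. This is a generic, minimal version: the cost and neighbour functions, the toy objective and all the parameter values are illustrative assumptions, not taken from any particular system.

    import math
    import random

    def simulated_annealing(initial, cost, neighbour,
                            temp=1.0, cooling=0.995, steps=10_000):
        """Randomly perturb the current solution; always accept
        improvements, and sometimes accept worse moves (more often
        while the 'temperature' is high) to escape local optima."""
        current, current_cost = initial, cost(initial)
        best, best_cost = current, current_cost
        for _ in range(steps):
            candidate = neighbour(current)
            delta = cost(candidate) - current_cost
            # Worse moves accepted with probability exp(-delta / temp).
            if delta <= 0 or random.random() < math.exp(-delta / temp):
                current, current_cost = candidate, current_cost + delta
                if current_cost < best_cost:
                    best, best_cost = current, current_cost
            temp *= cooling   # cool down: fewer uphill moves over time
        return best, best_cost

    # Toy use: minimise a bumpy one-dimensional function.
    f = lambda x: x * x + 10 * math.sin(x)
    step = lambda x: x + random.uniform(-1, 1)
    print(simulated_annealing(random.uniform(-10, 10), f, step))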
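The probabilistic primality testing described in the cryptography item is usually some variant of the Miller-Rabin test: random 'witnesses' probe the candidate, and every witness that fails to prove it composite leaves at most a 1-in-4 chance of being fooled. A minimal sketch (real implementations, as in [Schneier 1996], add refinements):

    import random

    def is_probably_prime(n, rounds=20):
        """Miller-Rabin test: each random witness that fails to show
        n composite leaves at most a 1/4 chance of error, so 20
        rounds give an error probability below 4**-20."""
        if n < 2:
            return False
        for p in (2, 3, 5, 7, 11, 13):
            if n % p == 0:
                return n == p
        # Write n - 1 as d * 2**r with d odd.
        d, r = n - 1, 0
        while d % 2 == 0:
            d //= 2
            r += 1
        for _ in range(rounds):
            a = random.randrange(2, n - 1)   # random probe
            x = pow(a, d, n)
            if x in (1, n - 1):
                continue
            for _ in range(r - 1):
                x = pow(x, 2, n)
                if x == n - 1:
                    break
            else:
                return False   # a witnesses that n is composite
        return True            # almost certainly prime

    # Randomly generate candidates and test, as described above.
    while True:
        candidate = random.getrandbits(256) | 1   # a random odd number
        if is_probably_prime(candidate):
            print(candidate)
            break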
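The white-noise effect in the signal-processing item can be seen in a toy simulation: a constant signal smaller than one quantisation step disappears entirely under plain rounding, yet survives on average once random noise is added before quantising. This is a numerical illustration only, not real ADC code; the signal level and sample count are arbitrary.

    import random

    def quantise(x):
        """Round to the nearest whole quantisation step (1 = one LSB)."""
        return round(x)

    signal = 0.3        # a constant signal well below one step
    n = 100_000

    # Without noise the signal vanishes: every sample quantises to 0.
    plain = sum(quantise(signal) for _ in range(n)) / n

    # With white noise added first, averaging recovers the sub-step level.
    dithered = sum(quantise(signal + random.uniform(-0.5, 0.5))
                   for _ in range(n)) / n

    print(plain, dithered)   # roughly 0.0 versus 0.3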
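The core of the CDMA scheme in the wireless item also fits in a short sketch: each transmitter multiplies its +/-1 data bits by its own pseudo-random +/-1 'chip' sequence, and a receiver correlating with the same sequence recovers that transmitter's bits while the others average out as noise. This is an idealised model - no timing, power control or channel effects - and the seeds, bit patterns and chip length are invented for illustration.

    import random

    CHIPS = 64   # chips (code elements) per data bit

    def make_code(seed):
        """A reproducible pseudo-random +/-1 spreading code."""
        rng = random.Random(seed)
        return [rng.choice((-1, 1)) for _ in range(CHIPS)]

    def spread(bits, code):
        """Multiply each +/-1 data bit by the whole chip sequence."""
        return [b * c for b in bits for c in code]

    def despread(channel, code):
        """Correlate the channel with one code: the matching signal
        sums coherently; other transmitters roughly cancel out."""
        bits = []
        for i in range(0, len(channel), CHIPS):
            corr = sum(s * c for s, c in zip(channel[i:i + CHIPS], code))
            bits.append(1 if corr > 0 else -1)
        return bits

    code_a, code_b = make_code(1), make_code(2)
    bits_a, bits_b = [1, -1, 1, 1], [-1, -1, 1, -1]

    # Two transmitters share the same band: their signals simply add.
    channel = [x + y for x, y in zip(spread(bits_a, code_a),
                                     spread(bits_b, code_b))]

    print(despread(channel, code_a))   # recovers bits_a
    print(despread(channel, code_b))   # recovers bits_b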
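Finally, the 'cheap enough' fix from the telephone routing item (and, in spirit, the random routing between processors) amounts to choosing uniformly among all routes within some tolerance of the cheapest, so that diverted traffic is spread rather than dumped onto a single backup route. The route names, costs and 10% tolerance below are all hypothetical.

    import random

    def choose_route(route_costs, tolerance=0.10):
        """Pick uniformly at random among all routes within
        `tolerance` of the cheapest, rather than always taking
        the single cheapest: diverted traffic is then spread
        instead of all hitting one backup exchange."""
        cheapest = min(route_costs.values())
        good_enough = [name for name, cost in route_costs.items()
                       if cost <= cheapest * (1 + tolerance)]
        return random.choice(good_enough)

    routes = {"via_A": 1.00, "via_B": 1.05, "via_C": 1.08, "via_D": 1.40}

    # If exchange A fails, calls scatter over B and C rather than
    # all hammering B, the deterministic 'next cheapest'.
    del routes["via_A"]
    print([choose_route(routes) for _ in range(10)])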
In some of these cases randomness is used simply to reduce the computational effort: if one could do the calculation in full it would be better, but the random version is 'good enough'. However, in many cases the randomness is essential; without it the system would be worse or simply incorrect.
References
[CDMA 2000] CDMA Development Group. What is CDMA (Code Division Multiple Access)? (accessed 17th Nov 2001, dated © 2000) [http://www.cdg.org/technology/]
[Dugelay 2001] Jean-Luc Dugelay and Fabien A. P. Petitcolas. Digital watermarking (tutorial). SAICSIT 2001, South African Institute of Computer Scientists and Information Technologists Annual Conference, University of South Africa, Pretoria, September 2001, pp.25–28.
[Dugelay 1999] Jean-Luc Dugelay and Stéphane Roche. Chapter 6: A survey of current watermarking techniques. In Information Hiding Techniques for Steganography and Digital Watermarking, Stefan Katzenbeisser and Fabien A. P. Petitcolas (eds.). Artech House Books, 1999. ISBN 1-58053-035-4.
[Raman 1998] R. Raman. Random Sampling Techniques in Parallel Computation. Proceedings of the Third Workshop on Randomized Parallel Computing, IPDPS/SPDP Workshops 1998, pp.351–360.
[Schneier 1996] B. Schneier. Applied Cryptography, Second Edition. Wiley, 1996. ISBN 0-471-12845-7.