Report: dvi (24K), pdf (200K), ps (88K).
Transparencies (1 per page): dvi (22K), pdf (103K), ps (77K).
Transparencies (4 per page): pdf (212K).
Complexity theory usually considers the worst-case behaviour of deterministic algorithms, but average-case behaviour can also be analysed if the input data is assumed to be drawn randomly from a given distribution. Rabin popularised the idea of "probabilistic" algorithms, in which randomness is incorporated into the algorithm itself rather than assumed in the input data. Yao showed that there is a close connection between the complexity of probabilistic algorithms and the average-case complexity of deterministic algorithms.
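The best-known example of a probabilistic algorithm in Rabin's sense is the Miller-Rabin primality test: it flips coins internally, and for a composite input each random round detects compositeness with probability at least 3/4, so the error probability shrinks geometrically with the number of rounds. A minimal Python sketch (the function name and round count are illustrative choices, not from the original):

```python
import random

def is_probable_prime(n, rounds=20):
    """Miller-Rabin test: False means n is definitely composite;
    True means n is prime with error probability <= 4**(-rounds)."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):
        if n % p == 0:
            return n == p
    # Write n - 1 = 2**s * d with d odd.
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)   # random base: the "coin flips"
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False  # base a witnesses that n is composite
    return True
```

Here the randomness lives entirely in the choice of bases `a`; no assumption about the distribution of inputs `n` is needed, which is exactly the contrast with average-case analysis drawn above.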
We give examples of the uses of randomness in computation, discuss the contributions of Rabin, Yao and others, and mention some open questions.