Dagstuhl Seminar 9527
"Average-Case" Analysis of Algorithms
(July 3 – July 7, 1995)
Organizers
- H. Prodinger
- Ph. Flajolet
- R. Kemp
- R. Sedgewick
Analysis of algorithms aims at a precise prediction of the expected performance of algorithms under well-defined randomness models of data. The classical aspects are well covered by Knuth's magnum opus, which treated, already more than twenty years ago, many aspects of fundamental algorithms, seminumerical algorithms, and sorting and searching. In these and other domains, what is sought is a precise description of the average-case behaviour of algorithms, often a very meaningful practical measure of complexity.
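To make the notion concrete, here is a minimal sketch (our own illustration, not part of the seminar material): under the uniform random-permutation model, a basic quicksort makes exactly 2(n+1)H_n − 4n ≈ 2n ln n key comparisons on average, a classical result of exactly this kind. The Python snippet below compares a simulated average against that formula; all names and parameter choices are illustrative.

```python
import random

def quicksort_comparisons(a):
    """Count key comparisons of a basic quicksort (first element as pivot)."""
    if len(a) <= 1:
        return 0
    pivot = a[0]
    smaller = [x for x in a[1:] if x < pivot]
    larger = [x for x in a[1:] if x >= pivot]
    # Partitioning around the pivot costs len(a) - 1 comparisons.
    return (len(a) - 1) + quicksort_comparisons(smaller) + quicksort_comparisons(larger)

n, trials = 1000, 200
empirical = sum(quicksort_comparisons(random.sample(range(n), n))
                for _ in range(trials)) / trials

# Classical result under the random-permutation model:
# E[comparisons] = 2(n+1)H_n - 4n, asymptotically 2 n ln n.
harmonic = sum(1.0 / k for k in range(1, n + 1))
print(f"empirical mean: {empirical:.0f}")
print(f"exact formula : {2 * (n + 1) * harmonic - 4 * n:.0f}")
```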
The field is undergoing tangible changes. We now see the emergence of combinatorial and asymptotic methods that permit us to classify data models into broad categories amenable to a unified treatment. This has two important consequences for the analysis of algorithms: it becomes possible to predict average-case behaviour under more complex data models (for instance, nonuniform models and even models with Markovian dependencies); at the same time, it becomes possible to analyse structurally much more complex algorithms, since we now have a much higher-level grasp of the average-case analysis process.
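As a small illustration of a nonuniform model (again our own sketch, not from the seminar): for a string emitted by a two-state Markov source, the expected number of occurrences of the pattern "ab" is, near stationarity, (n − 1)·π_a·p_ab, where π_a is the stationary probability of the letter a. The transition matrix below is an arbitrary choice made for the example.

```python
import random

# Two-letter Markov source: P[x][y] = Pr(next = y | current = x).
# This particular matrix is an arbitrary illustrative choice.
P = {'a': {'a': 0.7, 'b': 0.3},
     'b': {'a': 0.4, 'b': 0.6}}

def markov_string(n, start='a'):
    """Generate a length-n string from the two-state Markov source."""
    out, cur = [start], start
    for _ in range(n - 1):
        cur = 'a' if random.random() < P[cur]['a'] else 'b'
        out.append(cur)
    return ''.join(out)

def count_ab(s):
    """Count (possibly overlapping) occurrences of the pattern 'ab'."""
    return sum(1 for i in range(len(s) - 1) if s[i:i + 2] == 'ab')

n, trials = 10_000, 100
empirical = sum(count_ab(markov_string(n)) for _ in range(trials)) / trials

# Stationary probability of 'a' for a two-state chain, then the
# first-moment prediction (n - 1) * pi_a * p_ab (exact in stationarity).
pi_a = P['b']['a'] / (P['a']['b'] + P['b']['a'])
predicted = (n - 1) * pi_a * P['a']['b']
print(f"empirical: {empirical:.1f}, predicted: {predicted:.1f}")
```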
Mathematical analysis of algorithms is also gradually extending to more recent or less classical areas of computer science. Circuits are an instance: the depth of a random circuit proves to be much smaller on average than in the worst case (Diaz). Gate matrix design may even benefit from methods of random graph theory (Karonski). Generating function methods prove operational in the reliability analysis of a cellular network (Sipala). Finally, ideas stemming from the analysis of Shellsort lead to the design of highly regular practical sorting networks that appear to sort almost surely (Sedgewick).
Other new applications of the analysis of algorithms concern patterns in strings, as in DNA sequences (Régnier), computing with faulty processors (Louchard), parallel simulations (Greenberg), random and exhaustive generation (Kemp), simulation (Robson), computer algebra (Gonnet), occupancy problems (Gardy), and parallel scheduling (Wright).
This meeting is the second of its kind. The first one (Dagstuhl report #68) gave rise to a healthy 315-page special issue of the journal Theoretical Computer Science (vol. 144, no. 1–2). The present meeting is coupled with a special issue of the journal Random Structures and Algorithms, due to appear in 1997.