By Louchard, Guy; Randrianarimanana, B.; Schott, René
Reference: Lecture Notes in Computer Science, 372 LNCS, pages 521-533
Publication: Published, 1989
Peer-reviewed article
Abstract: By dynamic algorithms, we mean algorithms that operate on dynamically varying data structures (dictionaries, priority queues, linear lists) subject to insertions I, deletions D, and positive (resp. negative) queries Q+ (resp. Q−). Recall that dictionaries can be implemented by unsorted or sorted lists or binary search trees; priority queues by sorted lists, binary search trees, binary tournaments, pagodas, or binomial queues; and linear lists by sorted or unsorted lists, etc. The following question is therefore natural in computer science: for a given data structure, which representation is the most efficient? In comparing the space or time costs of two data organizations A and B for the same operations, we cannot merely compare the costs of individual operations for data of given sizes: A may be better than B on some data, and conversely on others. A reasonable way to measure the efficiency of a data organization is to consider sequences of operations on the structure. J. Françon [6], [7] and D.E. Knuth [12] discovered that the number of possibilities for the i-th insertion or negative query is equal to i, but that for deletions and positive queries this number depends on the size of the data structure. Answering the questions raised in [6], [7] and [12] is the main object of this paper. More precisely, we show: (i) how to compute the average costs explicitly, (ii) how to obtain variance estimates, and (iii) that the costs converge as n→∞ to random variables that are either Gaussian or depend on Brownian excursion functionals (the limiting distributions are therefore completely described). To our knowledge, such a complete analysis has never been done before for dynamic algorithms in Knuth's model.
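The possibility counts described in the abstract can be made concrete with a small sketch. Assuming the dictionary rules of the Françon–Knuth model as stated above (an insertion or negative query on a structure holding k keys has k+1 possible outcomes, i.e. one per rank gap, while a deletion or positive query has k, one per stored key), the following illustrative Python helper (not from the paper) counts the number of distinct histories of a given operation schema:

```python
def count_histories(ops):
    """Count the distinct histories of an operation schema for a
    dictionary, under the possibility rules attributed to Françon
    and Knuth in the abstract:
      - insertion 'I' or negative query 'Q-' on a structure of
        size k has k + 1 possibilities (one per rank gap),
      - deletion 'D' or positive query 'Q+' has k possibilities
        (one per stored key).
    """
    size, total = 0, 1
    for op in ops:
        if op == 'I':
            total *= size + 1   # i-th insertion: i possible ranks
            size += 1
        elif op == 'Q-':
            total *= size + 1   # negative query hits one of k+1 gaps
        elif op == 'D':
            total *= size       # deletion removes one of k keys
            size -= 1
        elif op == 'Q+':
            total *= size       # positive query hits one of k keys
        else:
            raise ValueError(f"unknown operation: {op}")
    return total

# For the schema I, I, D: 1 * 2 * 2 = 4 histories.
print(count_histories(['I', 'I', 'D']))   # → 4
```

Averaging a representation's cost uniformly over all such histories of length n is the measure of efficiency the abstract refers to.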