On the asymptotic behaviour of a sequence of random variables of interest in the classical occupancy problem
GIULIANO, RITA;
In press
Abstract
In the classical occupancy problem one puts balls in n boxes, and each ball is independently assigned to any fixed box with probability 1/n. It is well known that, if we consider the random number T_n of balls required to have all the n boxes filled with at least one ball, the sequence (T_n/(n log n)) converges to 1 in probability. Here we present the large deviation principle associated with this convergence. We also discuss the use of the Gärtner-Ellis theorem for the proof of some parts of this large deviation principle.
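As a quick illustration of the convergence stated in the abstract (not part of the paper itself), the following minimal Monte Carlo sketch in Python simulates the occupancy scheme and checks that T_n/(n log n) approaches 1 as n grows. The function name and trial counts are illustrative choices, not taken from the source.

```python
import math
import random

def occupancy_time(n, rng):
    """Throw balls uniformly into n boxes until every box holds at least
    one ball; return the number T_n of balls used."""
    filled, t = set(), 0
    while len(filled) < n:
        filled.add(rng.randrange(n))  # assign one ball to a uniform box
        t += 1
    return t

rng = random.Random(0)
for n in (10, 100, 1000, 10_000):
    # Average a few independent replications to smooth the estimate.
    trials = [occupancy_time(n, rng) for _ in range(20)]
    avg = sum(trials) / len(trials)
    print(f"n={n:>6}  mean T_n/(n log n) = {avg / (n * math.log(n)):.3f}")
```

Running the sketch, the printed ratio drifts toward 1 as n increases, consistent with the convergence in probability of (T_n/(n log n)); the large deviation principle studied in the paper quantifies how quickly the probability of ratios away from 1 decays.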