OK, I didn't know that the stationary distribution of an M/G/inf queue is the same Poisson distribution as that of an M/M/inf queue with the same arrival process and mean serving time. In our model, triples of players are 'customers', tournaments are 'servers', and the 'serving time' is the tourney duration, which has a 'general' distribution G that, btw, doesn't look like any 'standard' distribution.
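A quick sanity check of that insensitivity claim (my own sketch, not part of the thread; the arrival rate and the uniform service distribution are just made-up stand-ins for the real data): simulate an M/G/inf system with a deliberately non-exponential service time and check that the occupancy count looks Poisson (mean equals variance).

```python
import random

random.seed(1)
lam = 2.0    # arrival rate, hypothetical units (triples per hour)
T = 100.0    # observation time, much longer than any service time

def service():
    # A 'general' (non-exponential) duration: uniform on [0.5, 3.5], mean 2.0
    return random.uniform(0.5, 3.5)

def busy_at_T():
    # Count Poisson arrivals before T whose service is still in progress at T.
    t, n = 0.0, 0
    while True:
        t += random.expovariate(lam)
        if t > T:
            return n
        if t + service() > T:
            n += 1

samples = [busy_at_T() for _ in range(4000)]
mean = sum(samples) / len(samples)
var = sum((x - mean) ** 2 for x in samples) / len(samples)
# For a Poisson(lam * E[service]) = Poisson(4) occupancy, mean == var == 4.
print(mean, var)
```

Both numbers should come out close to lam * E[service] = 4, even though the service times are nothing like exponential.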
So now we know that the number of running tourneys at a given moment is Poisson distributed with mean L [I'm too lazy to write 'lambda' all the time], and that the distribution of the highest running multiplier follows a simple parametric model.
E.g. the conditional probability that the biggest JP tourney currently running is not a 2500x, given that k tourneys are running, is 0.99999^k (the frequency of 2500x's is 1/100000); hence the total probability is e^{-L} * \sum_k (0.99999*L)^k/k! = e^{-0.00001*L}.
Analogously, the probability that a single tourney is neither a 2500x nor a 200x is 0.99994, hence the probability that the biggest running tourney is a 100x or smaller is e^{-0.00006*L}. And so on.
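Here's the thinning argument above checked both ways, in closed form and by Monte Carlo (my own sketch; L = 30 is an arbitrary assumed value, the 1/100000 frequency is from the post):

```python
import math
import random

L = 30.0   # hypothetical mean number of concurrent tourneys
p = 1e-5   # frequency of 2500x tourneys

# Closed form: e^{-L} * sum_k ((1-p)*L)^k / k!  =  e^{-p*L}
closed = math.exp(-p * L)

random.seed(0)

def poisson(lam):
    # Knuth's multiplication method; fine for lam this small
    limit, k, prod = math.exp(-lam), 0, 1.0
    while True:
        prod *= random.random()
        if prod <= limit:
            return k
        k += 1

# Monte Carlo: draw K ~ Poisson(L) running tourneys, flip a p-coin for each,
# and record whether none of them turned out to be a 2500x.
trials = 100_000
no_2500 = sum(
    all(random.random() > p for _ in range(poisson(L)))
    for _ in range(trials)
)
print(closed, no_2500 / trials)
```

The empirical proportion should agree with e^{-0.00001*L} to a few decimal places.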
L can now be estimated by maximum likelihood (or the minimum chi-square method); we only need to make sure the observations are 'independent', i.e. wait long enough between observations for the current top tourney to finish.
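For the simplest version of that estimate, suppose each observation just records whether a 2500x is among the running tourneys. Then each observation is Bernoulli(1 - e^{-0.00001*L}) and the MLE inverts the sample frequency. The data below are made up purely for illustration:

```python
import math

p = 1e-5                     # frequency of 2500x tourneys
# 1 = "a 2500x was running at observation time"; fabricated sample of 10000
obs = [1] * 3 + [0] * 9997

# Each independent observation is Bernoulli(1 - e^{-p*L}), so the MLE for L
# is obtained by inverting the observed frequency:
freq = sum(obs) / len(obs)
L_hat = -math.log(1 - freq) / p
print(L_hat)
```

With richer observations (the full top-multiplier category rather than a yes/no), you'd multiply the corresponding category probabilities into one likelihood and maximise numerically instead.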
Heh, it's been a while since I last did stats at uni level of complexity.
Last edited by coon74; 05-28-2015 at 01:10 PM.