A paradox that was the source of much correspondence among eighteenth-century mathematicians. Originally posed by Nicolaus Bernoulli to Montmort, it became well known as a result of the solution given by Daniel Bernoulli in 1738 in the journal of the St Petersburg Academy.
Bernoulli’s scenario was essentially as follows. Two players, A and B, play the following game. Player A repeatedly tosses a coin, stopping when a head is obtained. If A has to toss the coin k times, then A pays £2^k to B. Bernoulli’s question is ‘How much should B pay A in order to make the game fair?’ The answer is that B must pay A the average amount that A pays B. Half the time (assuming a fair coin) the payment will be £2; half the remaining time it will be £4, and so on. In general the kth outcome occurs with probability 1/2^k and pays £2^k, so each term contributes £1 to the average.
However, since the required number of tosses has no upper limit, this sum is infinite. Thus, for this game to be fair, B must pay A an infinite amount of money, even though B is certain to receive only a finite amount of money in exchange (and less than £10 on 87.5% of occasions).
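The game is easy to simulate, and doing so illustrates both claims: almost all rounds pay under £10, yet the running average of the payouts drifts upward rather than settling. The following is a minimal sketch (the function name and round count are illustrative, not from the original):

```python
import random

def play_once(rng):
    """One round of the game: toss until a head appears on toss k; payout is 2**k."""
    k = 1
    while rng.random() < 0.5:  # tails with probability 1/2, so keep tossing
        k += 1
    return 2 ** k

rng = random.Random(0)
n = 100_000
payouts = [play_once(rng) for _ in range(n)]

mean = sum(payouts) / n
under_10 = sum(p < 10 for p in payouts) / n  # payouts of £2, £4 or £8

print(f"average payout over {n} rounds: £{mean:.2f}")
print(f"fraction of rounds paying under £10: {under_10:.3f}")  # close to 0.875
```

No matter how many rounds are simulated, the empirical average never converges: rare, enormous payouts keep pulling it up, which is the finite-sample face of the infinite expectation.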