I've always loved the idea of coin flips, and hence this paean to a few of their properties.
One of the most basic tools in the study of probability and its applications, coin flips reflect or model many profound dichotomies, ranging from yin-yang and win-lose to yes-no and 0-1.
A more technical property of coin flips, however, is even more useful: independence. Flips are independent in the sense that the outcome of one flip has no influence on the outcome of any other. When two (or more) events are independent in this sense, the probability that they both (or all) occur is computed by multiplying the probabilities of the individual events.
Thus, for example, the probability of obtaining two heads in two flips of a fair coin is 1/2 x 1/2 = 1/4.
Of the four equally likely possibilities for (first flip, second flip) -- T-T, T-H, H-T, and H-H -- only one of the outcomes is two heads.
For the same reason the probability of obtaining two heads followed by a tail and then another head (H-H-T-H) is 1/2 x 1/2 x 1/2 x 1/2 = 1/16, and the probability of five straight flips of a fair coin resulting in heads (H-H-H-H-H) is 1/2 x 1/2 x 1/2 x 1/2 x 1/2 = 1/32.
Alas, people sometimes use crooked or biased coins, either knowingly or in an attempt to cheat others.
To obtain a fair result from a biased coin, the mathematician John von Neumann devised the following trick. He advised the two parties involved to flip the coin twice. If it comes up heads both times or tails both times, they are to flip the coin two more times.
If it comes up H-T, the first party will be declared the winner, while if it comes up T-H, the second party is declared the winner. The probabilities of both these latter events (H-T and T-H) are the same because the coin flips are independent even if the coin is biased.
For example, if the coin lands heads 70 percent of the time and tails 30 percent of the time, an H-T sequence has probability .7 x .3 = .21 while a T-H sequence has probability .3 x .7 = .21. So 21 percent of the time the first party wins, 21 percent of the time the second party wins, and the other 58 percent of the time when H-H or T-T comes up, the coin is flipped two more times.
In this way both parties can be confident they have an equal chance of winning even if the coin is biased.
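Von Neumann's trick is easy to check empirically. Here is a quick simulation sketch in Python (the 70 percent heads bias is just an illustrative choice, as in the example above):

```python
import random

def von_neumann_fair_flip(biased_flip):
    """Flip a biased coin twice; keep re-flipping on H-H or T-T.
    H-T means the first party wins (return 1), T-H the second (return 0)."""
    while True:
        first, second = biased_flip(), biased_flip()
        if first != second:
            return first

# A coin that lands heads (1) 70 percent of the time.
random.seed(1)
biased = lambda: 1 if random.random() < 0.7 else 0

N = 100_000
wins = sum(von_neumann_fair_flip(biased) for _ in range(N))
print(wins / N)  # close to 0.5 despite the 70/30 bias
```

Despite the lopsided coin, the first party wins very nearly half the time.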
There is an alternative approach that also works if you're unsure of the fairness of the coin your opponent has brought and you're allowed to call the flip. In that case, before you call it, flip your own fair coin and guess that your opponent's coin will land the same way.
Even if your opponent's coin lands heads 90 percent of the time and tails 10 percent of the time, you'll still win the flip 50 percent of the time since you're just as likely to pick the biased coin's good side as its bad side. (50 percent of 90 percent = 45 percent, and 50 percent of 10 percent = 5 percent, so your chances of winning are 45 percent + 5 percent or 50 percent).
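This too can be verified with a short simulation sketch (the 90 percent bias below is the hypothetical figure from the example):

```python
import random

random.seed(2)
N = 100_000
wins = 0
for _ in range(N):
    call = random.random() < 0.5      # your fair coin decides your call
    outcome = random.random() < 0.9   # opponent's coin: heads 90 percent of the time
    wins += (call == outcome)         # you win when your call matches the outcome
print(wins / N)  # about 0.5
```

Since your call is equally likely to land on the biased coin's good side as its bad side, the win rate settles at 50 percent.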
An argument like this can be used to provide a partial defense of insider trading and stock price manipulation since the outside investor is as likely to benefit therefrom as be hurt.
From now on, I'll assume that the coin is a fair one. It's easy to use such a coin to obtain probabilities of the form m/2^n (1/4, 7/8, 5/16, etcetera) by flipping a coin n times. But how can we obtain a probability of exactly 1/3? Von Neumann's trick can be modified to give us an answer.
Flip the fair coin twice. As mentioned, the four equally likely outcomes are H-H, H-T, T-H, T-T, where once again the order of the letters indicates the order of the two flips. Count H-H as a success, H-T and T-H as failures, and T-T as calling for another two flips. In this way one of the three equally likely outcomes that end the game counts as a success, so the probability of success is 1/3, and the probability of failure is 2/3. Comparable tricks can be devised to yield any probability m/n we might want.
Sequences of coin flips can also be used to illustrate many ideas from statistics, but these last two puzzles about sequences are less well-known. Imagine that you continue to flip a fair coin and generate a sequence of Hs and Ts.
The problem is that you must bet on whether T-H is more, less, or equally likely to come up before H-H in the sequence of coin flips. Note that in the particular string H-T-H-T-T-H-H-T-T-..., T-H first comes up after 3 flips and H-H after 7 flips.
A little thought makes clear that T-H will usually come before H-H since the only way H-H can come first is for the first two flips to be H-H. This occurs with probability 1/2 x 1/2 = 1/4. If this doesn't happen, the next H must be preceded by a T and so T-H will come up first. Thus T-H comes up before H-H 3/4 of the time.
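A short simulation sketch confirms the 3/4 figure:

```python
import random

def race(seq_a, seq_b):
    """Flip a fair coin until one of the two patterns first appears; return the winner."""
    history = ""
    while True:
        history += random.choice("HT")
        if history.endswith(seq_a):
            return seq_a
        if history.endswith(seq_b):
            return seq_b

random.seed(4)
N = 40_000
th_first = sum(race("TH", "HH") == "TH" for _ in range(N))
print(th_first / N)  # close to 3/4
```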
Finally, a couple of much trickier questions of the same type for math aficionados: Continue with a string of fair coin flips. Which sequence is likely to come up first: T-H-T or T-H-H? How many times, on average, must you flip a coin before T-H-T comes up? And how many times, on average, must you flip a coin before T-H-H comes up? The answers are below.
Although coin flips may appear abstract and simplistic, they can be used to clarify almost any scenario in which probability plays a role. One can do worse than adopt a flip view of life.
Answers: T-H-T and T-H-H are equally likely to come first in a sequence of fair coin flips (they both begin with a T-H). Nevertheless, the average number of flips before T-H-H comes up is 8, whereas the average number of flips before T-H-T comes up is 10. (The counterintuitive result may be found by using conditional expectations and deriving a recursion formula for them or by simulation.)
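For readers who prefer the simulation route mentioned above, a sketch of the waiting-time estimate:

```python
import random

def flips_until(pattern):
    """Count fair-coin flips until the pattern first appears."""
    history = ""
    while not history.endswith(pattern):
        history += random.choice("HT")
    return len(history)

random.seed(5)
N = 50_000
avg_thh = sum(flips_until("THH") for _ in range(N)) / N
avg_tht = sum(flips_until("THT") for _ in range(N)) / N
print(avg_thh, avg_tht)  # about 8 and about 10
```

Roughly speaking, T-H-T takes longer on average because a near miss (T-H followed by T) hands T-H-T a fresh T to build on only after a longer detour, while a near miss for T-H-H (T-H followed by T) still leaves a T in place.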
John Allen Paulos, a professor of mathematics at Temple University in Philadelphia, is the author of the best-sellers "Innumeracy" and "A Mathematician Reads the Newspaper," as well as, most recently, "Irreligion." He's on Twitter and his "Who's Counting?" column on ABCNews.com usually appears the first weekend of every month.