Why Oh Why Can't We Have a Better Press Corps?
Lloyd Rose doesn't understand probability, but the editors of the Washington Post don't mind. He reviews Chance by Amir D. Aczel:
The author of several other books on science for the layman, Aczel encourages his readers to take a can-do attitude. Cheerfully, he confides, "I want to tell you a secret: measuring probabilities is as simple as counting." Well, sort of. "Chance" contains a number of equations. Happily, the math-impaired can skip these and still glean the general idea.

Where to begin? Rose commits the gambler's fallacy (even though he realizes that coin flips are memoryless), claims that every random variable has the same distribution, and repeatedly claims that mathematicians have no idea why all these theorems of theirs are true. Perhaps you don't glean as much as Rose thought if you skip all of the equations.
The general idea of probability theory is illustrated by the now infamous bell curve. Actually a value-neutral form of measurement, the curve demonstrates that if you gather random factors and then graph them, the resulting line will be near-flat on the left, rise gradually to the rounded peak of a hill, then sink at exactly the same gradient to near-flat again on the right, resulting in the shape of a bell. In a graph of the heights of American men, for instance, the central point would convey the largest category (say, men 5-feet-9), while the descending lines on the left and right would testify to the lesser number of men either smaller or taller, with the extremes growing as the line flattens out. The result is that we can know the odds of any American man being taller than 5-feet-9 or shorter than, say, 4-feet-10. An accumulation of random data always generates this curve of probability, as if conjuring it out of the air. No one knows why. It just happens.
At this point, we begin to approach the mystical element of mathematics. Any time you ask a "why" question about math, you can find yourself in unmapped territory.
Why, in simple arithmetic, do one and one always equal two? They just do. Where do we get the idea, removed from any objects, of "twoness"? We don't know. (Plato's theory of transcendent ideas is out of favor.) The individual flip of a coin is a memoryless event -- the second coin flipped doesn't remember whether the last coin came up heads or tails, and the third coin doesn't remember how the second flip turned out. You start fresh every time. And yet, as Aczel illustrates, after about the first 120 tosses, the results begin to come up 50-50 (though not all at once: for example, from roughly 250 to 550 tosses, the coins will mostly land heads-up, a lead that tails will catch up to later). Why does this happen? How does it happen? What's going on here anyway?
First, the gambler's fallacy. You don't need tails to "catch up" in order to get about 50% heads and 50% tails. An excess of 10 heads throws your average way off when you have 100 coin flips, but it matters much less once you have 1,000 or 10,000 flips. That is why the Law of Large Numbers makes the fraction of heads approach one-half when you have a large number of flips: see this java applet.
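You can see this in a few lines of simulation (a sketch using Python's standard library; the seed is arbitrary). The excess of heads over tails can wander around, but the fraction of heads still settles toward one-half, because a fixed excess gets divided by an ever larger number of flips:

```python
import random

random.seed(42)  # arbitrary seed, so the run is repeatable

def flip_stats(n_flips):
    """Flip a fair coin n_flips times; return (excess heads, fraction heads)."""
    heads = sum(random.random() < 0.5 for _ in range(n_flips))
    return heads - n_flips // 2, heads / n_flips

for n in (100, 1_000, 10_000, 100_000):
    excess, frac = flip_stats(n)
    # The excess can grow in absolute terms while the fraction shrinks toward 0.5.
    print(f"{n:>7} flips: {excess:+5d} excess heads, fraction = {frac:.3f}")
```

No run of tails has to compensate for anything; the early imbalance simply gets swamped.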
Some variables (like the number that comes up when you roll a die) don't have a normal distribution, but if you average together the numbers you get when you roll a bunch of dice, and you have enough dice, then the average will have a distribution that's close to normal. The same thing happens for the average of almost any random variable, on account of the Central Limit Theorem, as these pictures show.
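Here is a quick way to watch the Central Limit Theorem at work (a sketch with Python's standard library; the sample sizes are arbitrary). A single die is uniform, not bell-shaped, but the average of many dice clusters tightly around 3.5, with the spread shrinking like one over the square root of the number of dice:

```python
import random
import statistics

random.seed(0)  # arbitrary seed, so the run is repeatable

def dice_average(k):
    """Average of k fair six-sided dice."""
    return sum(random.randint(1, 6) for _ in range(k)) / k

for k in (1, 5, 50):
    samples = [dice_average(k) for _ in range(10_000)]
    mean = statistics.fmean(samples)
    stdev = statistics.stdev(samples)
    # Theory: the mean stays near 3.5 and the spread shrinks like 1/sqrt(k).
    print(f"{k:>3} dice: mean of averages = {mean:.2f}, stdev = {stdev:.3f}")
```

Plot a histogram of the k = 50 samples and you get the bell curve, even though no single die looks remotely bell-shaped.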
So what's going on? If you want to understand, you can skip the mysticism, but you have to know something about statistics. The editors of the Washington Post should know that.