This op-ed makes an argument, knowingly or not, that I have often made myself: the important statistic is usually not the mean but the variance. A lot of “expert advice” on the desirability of the UK remaining in the EU, on immigration, and on international economic integration in general is sold in terms of the medium-term impact on average, while the general public is far more sensitive to the distributional consequences for specific population segments in the short term. That is also the implicit argument Piketty makes about economic growth and inequality: the top has grown much faster than the bottom (indeed, the bottom has been stagnant or even falling) over the last several decades, so the variance in wealth has been growing faster than the mean.
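To make the point concrete, here is a minimal sketch with made-up, purely illustrative numbers: a “policy” whose headline average gain looks good even though one segment of the population is actually losing ground.

```python
from statistics import mean, pvariance

# Hypothetical income changes (in percent) for two population segments.
# These numbers are illustrative only, not real data.
top_decile = [8, 10, 12, 9, 11]    # concentrated gains at the top
bottom_half = [-2, 0, -1, 1, -3]   # stagnant or falling at the bottom

combined = top_decile + bottom_half

print(mean(combined))      # the headline "average gain": 4.5
print(pvariance(combined)) # the spread the headline hides: 32.25
print(mean(bottom_half))   # what one specific segment experiences: -1
```

The overall mean is comfortably positive, but it describes nobody in the bottom half; the variance is where the politically salient information lives.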
The means fallacy, if it can be called that, is one of the oldest lies people tell with statistics. Technically, it is not a lie: the statistics are true, but means are rarely representative of the real world from which the data are drawn. Indeed, there may be ZERO actual observations that correspond to the mean: after n coin tosses, the mean outcome is neither heads nor tails. Good luck finding a coin toss that actually lands on the mean. Yet people are asked to make judgments solely on comparisons of means all the time: on average, you will have more money with X, so you should “rationally” choose X! The big data movement dovetails nicely with this tendency: we want as big a dataset as possible so that we can predict the mean as precisely as possible. If the mean does not match the data, the data made a mistake. (While I jest, this is a surprisingly typical attitude among data science types. Among the DW-NOMINATE crowd, prediction errors are usually called “mistakes”: mistakes by the legislators who did not vote the way their scores say they should have, not errors by the algorithm that could not account for the actual votes.)
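The coin-toss point can be shown in a few lines. Encoding heads as 1 and tails as 0 (a standard convention, not anything specific to the data discussed here), the mean of a balanced sequence is 0.5, a value that no individual toss can ever take:

```python
from statistics import mean

# Encode heads as 1 and tails as 0; an illustrative sequence of 8 tosses.
tosses = [1, 0, 0, 1, 1, 0, 1, 0]

m = mean(tosses)  # 0.5: the "average toss"

# No individual toss equals the mean: every observation is exactly 0 or 1.
print(m, m in tosses)  # 0.5 False
```

The mean is a perfectly true summary, and simultaneously a value that never occurs in the data it summarizes.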
Of course, this in turn fits neatly into the risk-management mentality I described in the last post. “Democracy” means errors, uncertainty, and chaos. Thus “democracy” must be subverted and managed to minimize uncertainty, by introducing a set of “dictators” to bring in orderliness and “rationality.” Perhaps the problem is that the ideas of “orderliness” and “rationality” are themselves too detached from reality, and that more time should be spent looking at and thinking about reality, rather than parsing data like someone translating between languages without understanding either of them. Or, to quote the article: “To argue that problems can be solved without examining how and under what conditions is sheer intellectual laziness.” (I would replace “without examining how and under what conditions” with “without understanding what they are and where they come from.”)