In a highly regarded book, Duncan Watts generalizes something that has been well known in many fields for generations: once we (think we) know the "right answer," the alternatives seem so obviously wrong that we can no longer understand how anyone could reasonably have believed them, even if we ourselves found the answer less than obvious until just recently. The great paradox here is that, even when we consider the (allegedly right) answer obvious because we (think we) know it is right, we often do not know how it is right. Indeed, education is far more effective when the focus is on the wrong answers, and on how we know those answers are wrong. This would hardly have surprised Karl Popper: it is, after all, how he conceptualized the workings of "science."
But this is not how "science" works for lay audiences. "Science," to many, is smart people who know the "right answers" telling us why their answers are right. There is something inherently cultish and quasi-religious about this, and some people are seizing on it as problematic, which may be a good thing in general. People simply don't know the workings of science: they just "know" that Lavoisier was right and Mesmer was wrong because, for whatever reason, Lavoisier "looks" like the smarter person in retrospect. They would be lucky to know what either man actually said that was right or wrong, let alone be able to explain why and how it was right or wrong. (The truth is rather complicated: while Mesmer's ideas were definitely wrong, they still capture enough of people's fancy that, even today, many hawk updated versions of his beliefs; disturbingly, they often show up on public television. Lavoisier, while headed in the right direction, was also generally wrong, which is hardly surprising, as his science is badly out of date now.) In the absence of that particular knowledge, however, evaluating Lavoisier and Mesmer on their merits is impossible for many, if not almost all, people. There has been fascinating psychological research suggesting that TED-style lectures, with glitzy production values and smooth-talking lecturers, make people feel as though they have learned things, yet they perform no better when tested on what the talk was about, something I can attest to from a great deal of personal experience. The word "con man," of course, is short for "confidence man": someone who says X is true with certainty and confidence, even if he is lying, inspires far greater confidence than those who qualify their answers with caveats, conditionalities, and nuances, even though the latter are the marks of real knowledge.
I joke that economics used to contribute genuinely useful insights for policymaking back when economists had multiple arms, but economists had to cut off most of their arms because politicians would only trust one-armed economists, and the usefulness of economics to the real world suffered, because just about anyone can play a one-armed economist. It is all those extra arms, in the form of assumptions, qualifications, conditionalities, and caveats, that make economic insights useful, but they also make economics less "believable." Perhaps that is a good thing in principle: you don't "believe" a science, and if you "believe" a science without understanding its moving parts, you are doing something wrong. But one does not, in the end, sell "science" to the masses, I suppose.