There is a lot to chew over in this essay and in the much older essay that it links to. I do wonder, though, if the authors of these posts are themselves a bit trapped in (somewhat self-congratulatory, I'm afraid) bubbles of their own.
One book that went far toward shaping my thinking about the epistemology of science, even before I read Popper and Feyerabend, was The Nemesis Affair, by David Raup. Raup, who passed away in 2015, was a notable paleontologist who contributed to the idea that mass extinctions are cyclical and may have extraterrestrial origins, e.g. comets. He began the book by surveying theories like his that came before, almost invariably posed by great scientists with vast knowledge across multiple fields, including no small amount of expertise in paleontology and astronomy, but who were not professional astronomers or paleontologists: people like Harold Urey, the Nobel Prize-winning chemist. Their arguments invariably made their way into top journals like Nature, journals that many scientists pay attention to, only to be met with total silence. The reason, Raup suggested, was that arguments of the sort people like Urey were selling were so far outside the conventions of the field they were addressing that nobody knew how to respond. Unlike no-names, who can be safely brushed aside with the snide attribution that they say such things because they don't know better, famous and accomplished scientists cannot be so easily dismissed as cranks. So they get their hearing, the polite applause, and a publication in Nature, and then everyone goes on as if the whole thing never happened, because, for all practical purposes, it never did: their argument cannot be placed in context.
Could great scientists have been the only ones who saw puzzling clues like those that motivated Raup's publication about comets causing extinctions? It is doubtful: far more likely, the younger, less accomplished scientists who thought up such crazy ideas were told that pressing further would simply ruin their reputations and cost them tenure. And once they do get tenure and settle into being routine scientists of a middling sort, it is far easier to simply take conventional wisdom for what it is and live their lives. It takes someone who is both a great scientist and a madman, someone who enjoys such prestige and influence that he cannot be brushed aside so easily, to obnoxiously push forward new ideas. Of course, history reminds us that Galileo was such a person, forgotten though this is amidst all the mythmaking about his persecution: he was a friend of the Pope and of half the cardinals presiding over his trial, was treated like an honored guest while he was being "tried," and his sentence was to live for a while outside the city limits in the luxurious mansion of a friend and supporter. Hardly "persecution." The rest of us have to conform to the conventions if we value our lives, and quite frankly, "That's something I should comment on. Nah, what's the point? Too much downside" is the rule that all of us live by most of the time.
The trouble with this, of course, is that it creates an echo chamber of sorts: not necessarily one where everyone repeats and believes the same thing, but one where everyone knows what "the truth" is, repeats the same sanctimonious things, and keeps their real views to themselves. Societies like this are not uncommon. The USSR was like this: everyone knew what the official Truth was; that is literally what it says on the masthead of Pravda ("the truth"). So everyone repeated it and acted like they took it seriously, and nobody believed a word of it, whether it was true or not. Without the means of evaluating the "truth" to their satisfaction, everyone was essentially entitled to their private truths, whatever they believed was "really" going on in the world. But this is not just true of authoritarian societies: every society has certain myths that are "true" just because, that one cannot question. To question these "truths," indeed, is to expose oneself as an outsider who cannot be trusted: say, a Korean who questions some of the national myths about the horrors of Japanese rule, or an American who does not believe that Russians hacked the 2016 elections. It is not so much that these official truths are false: in fact, I'd imagine that, on average, the vast majority of the content in Pravda was factually true throughout the entire Soviet era. It is simply that overt questioning of the myths is not permitted.
The problem goes further than that: the truth comes wrapped in layers of uncertainty, and the definitions we work with are themselves fuzzy. Can we even handle the truth when we see it? As the saying might go, if we saw God face to face, would we even recognize Him? How people lie with statistics lives in the margins, the assumptions, and the definitions: the important thing, when dealing with data, is not whether something is or is not "true," but whether the estimates fall within an accepted margin of error, given certain definitions of how things work. What makes physicists, in particular, so much better at this than most other people is that they are very good at crafting these definitions and assumptions precisely and thinking them through logically. But when the universe itself is murky, such clean definitions are self-deceiving. As per my ever-persistent rant about DW-Nominate: yes, the numbers would indicate the "ideology" if ideology were spatial and people acted both geometrically and asocially (i.e. based only on their own "preferences," without politics), but those would be some pretty damn stupid assumptions to make when you are dealing with politics. A physicist's proper course as a scientist would be to conduct experiments in a setting where nuisances like air resistance or friction do not exist, or can at least be minimized, rather than pretend that the universe is frictionless everywhere and that models which assume away friction provide usable guidance. An engineer would take the theoretical models with a big grain of salt, relying instead on a bunch of formulas and tables that account for friction and such things for practical purposes. When the sacred-cow assumptions may not be challenged, we can do neither. (I had the good fortune of just rereading this essay by Freeman Dyson about his experience crunching numbers for the RAF Bomber Command during World War II.
Basically, you can get people to trust you not just because your numbers are good (they wouldn't recognize good numbers even if they saw them; numbers are NOT self-evident, especially in a world where uncertainty is high) but because you are a famous scientist and, more importantly, a decorated navy officer from World War I. If you are neither, they trust you only so far as your "information" confirms their existing beliefs, rightly or wrongly. Dyson has a wonderful description of this: if the former, you are giving "advice"; if the latter, you can only give "information.")
In a sense, this is the fundamental problem: even if what you are saying is true (and you yourself don't always know this), there is no guarantee that your interlocutor will recognize it as true. They have a certain set of ideas about what the "truth" should look like, and if what you say does not look like it, you'd better give them reasons why your truth is bigger than their truth. Not easy if they outrank you and tell you to "shut up." Feynman, in his famous "Cargo Cult Science" essay, had this to say about this:
“We have learned a lot from experience about how to handle some of the ways we fool ourselves. One example: Millikan measured the charge on an electron by an experiment with falling oil drops, and got an answer which we now know not to be quite right. It’s a little bit off because he had the incorrect value for the viscosity of air. It’s interesting to look at the history of measurements of the charge of an electron, after Millikan. If you plot them as a function of time, you find that one is a little bit bigger than Millikan’s, and the next one’s a little bit bigger than that, and the next one’s a little bit bigger than that, until finally they settle down to a number which is higher.
Why didn’t they discover the new number was higher right away? It’s a thing that scientists are ashamed of—this history—because it’s apparent that people did things like this: When they got a number that was too high above Millikan’s, they thought something must be wrong—and they would look for and find a reason why something might be wrong. When they got a number close to Millikan’s value they didn’t look so hard. And so they eliminated the numbers that were too far off, and did other things like that …”
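The drift Feynman describes can be illustrated with a toy simulation. This is purely a sketch of the anchoring dynamic, not a model of the actual oil-drop measurements: the constants (the "true" value, the first low report, the noise level, and the anchoring weight) are all made-up numbers chosen for illustration. Each new lab in the simulation measures independently, but reports a value pulled toward the previous published number, because deviations from the anchor get scrutinized away.

```python
import random

random.seed(42)  # fixed seed so the toy run is reproducible

TRUE_VALUE = 1.602    # stand-in for the "true" value (arbitrary units)
FIRST_REPORT = 1.50   # stand-in for an initial, slightly-low published value
NOISE = 0.05          # spread of an honest, independent measurement
ANCHOR = 0.7          # how strongly each lab leans on the previous published number

def next_report(previous, anchor=ANCHOR):
    """Draw an unbiased measurement, then report a compromise between it
    and the previously published value (the anchoring effect)."""
    measurement = random.gauss(TRUE_VALUE, NOISE)
    return anchor * previous + (1 - anchor) * measurement

reports = [FIRST_REPORT]
for _ in range(20):
    reports.append(next_report(reports[-1]))

# Each published value creeps toward the truth instead of jumping there,
# reproducing the "each one a little bit bigger" pattern Feynman describes.
print([round(r, 3) for r in reports])
```

With no anchoring (ANCHOR = 0), the very first independent measurement would already scatter around the true value; the slow creep exists only because each report is weighted toward its predecessor.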
Jessica Livingston is right: when there is silence, we do lose insights, as the aftermath of the Millikan oil-drop experiment shows. But we also know that every new idea we have is potentially mad, and we have much to suffer if we are perceived to be mad. Agreeing with the "right people" that their worldview is right, and that only minimal changes are necessary, if any, is something we do all the time. Of course, this pollutes the information provided: most of the information says, "I am your friend and I support you, whether you are right or wrong." Only a little bit says, "I think you are wrong." In a highly uncertain environment where right and wrong are not obvious, even on matters of fact, it is better, easier, and safer, not to mention more rewarding for one's career, to side with the conventional wisdom, or with whatever the presently important people have to say about the universe. Since the presently important people are usually not stupid, they are probably right anyway, and you probably did not make an important, earth-shaking discovery. But if they are wrong, they can be very wrong, and the error compounds when everyone is trying to be a friend of the powerful rather than tell the truth. Since we can only learn the truth secondhand, from the analyses of the people who crunch the numbers and NOT from our own analyses (remember, we can't handle the truth, literally, at least not all of it, so we almost always learn about the universe secondhand), when everyone who crunches numbers is more interested in appeasing the powerful and important than in raising questions, we don't even know how big a mistake we are making. (In retrospect especially, I think everyone knew that there was something fundamentally wrong with the Clinton candidacy, and there was so much wrong with it that everyone saw something different.
But everyone knew that she had to win, because the alternative did not make any sense, so they all minimized their sense of how likely a Clinton defeat was, until only the truly mad expected a Trump victory. I think that is a worse outcome than just hedging bets: it helped validate the true nutjobs as if they were the only sane people.)
If Livingston is only discovering this now, I think she has lived a charmed life that most of us don't have the luxury of.