Formulaic Thinking, Cargo Cults, and “Science” (in its varied guises)

I had earlier wondered whether the Cargo Cult can be broken–or, perhaps, even “should be” broken–if the cargo keeps flowing.  The situation is analogous, in a sense, to why there are neither true believers nor atheists in the foxhole, as the saying goes.  Soldiers and sailors do not necessarily know why or how bad or good things happen.  They believe that the world around them is complex enough that no simple “theory” is good enough, and that they lack both the time and the wherewithal to come up with sufficiently “good” theories, if that is at all possible.  They also reside in a world where the good and the bad are, literally, a matter of life or death.  They are not sufficiently invested in any theory being right or wrong to risk life and limb just to learn a bit more of “the truth.”  So they are superstitious, not necessarily because they don’t know the “science” (if anything, they are far more aware of its nuances and its “variance”), but because they are not invested in proving any theory right or wrong.  Notice that this logic applies to the “anti-science” as well as to science.  They may take “lucky charms” of various kinds seriously enough, but they don’t trust them so much that they are willing to risk their safety on the chance that the charm is indeed so “lucky.”  Thus, they are just superstitious enough to believe in all manner of totems, but not so superstitious as to “trust” them.

In most walks of life, even if the stakes are not nearly as high as those facing soldiers and sailors, the same attitude prevails:  life may be complex in its totality, but abiding by simple rules, accepted on the premise that everyday things are the way they usually are and “should be,” is usually good enough.  Following formulas keeps you on the safe side most of the time, while keeping you away from undue risk and headaches, because the world does not often change so drastically.  Thus, people are creatures of habit, inherently “conservative” in their worldview, usually unwilling to change their minds quickly without a good reason–but not so wedded to their worldview that they are unwilling to change what they think in the face of a “good enough reason,” without attendant risks.  So even socially conservative-minded people, as long as their contact with transgenders is limited and they have no reason to be biased against them, might be willing to rethink their opinion if asked nicely, for example.  The caveat “without attendant risks,” however, looms large here:  can the same approach be used to change people’s opinions about guns?  About Muslims?  Heck, even about a lot of race-related questions?  Transgenders, as a group, are simply “a bit odd” in the minds of many–even those who are predisposed to oppose their way of life.  They lack a “good enough reason” to oppose them.  Hostility to guns and Muslims, however, belongs on a different plane.  The beliefs may or may not be justified on factual grounds, but there is a widespread perception of physical danger and direct harm that they pose, even if only with a small probability.  There is a “good reason” that people may persist in their belief in the face of attempts to convince them otherwise.  For comparison, one might say that it is easy for a scientist to convince a sailor to start using a compass (or, perhaps, to start wearing a red shirt, if the former can convince the latter, rightly or wrongly, that the red shirt helps him stay safe) but not to convince him to stop carrying a parrot, if the sailor is convinced that the parrot keeps him away from shoals.  Starting to use the compass (or wear a red shirt) seemingly imposes little cost but promises a chance of potentially large benefits.  It’s a lottery ticket worth buying.  But people will not take up what they consider a big risk without due compensation, at the very least.  Taking away a sailor’s parrot offers him nothing.

To elaborate further on the point I was raising the other day, then, “scientific progress” is an inherently risky process.  “Science” demands that those who have been following a well-established set of routines stop following them and start introducing variations, just to see what happens.  But every routine is characterized by a belief that it “works,” that following it brings considerable benefits and that not doing so is quite costly.  If the Aztecs reap the benefits of sunlight–huge, obviously–in return for sacrificing conquered subjects, which, for the Aztecs, is very easy thanks to their warlike nature, why would they want to risk a world without sunlight for a trivial gain like making nice with the pathetic Tlaxcalans?  The existing mindset–the “culture” or “affect,” depending on whether you were trained in anthropology or political psychology–shapes how people value the consequences of the roads not taken, of the routines being broken.  Those valuations are never proven wrong, because there is no evidence to say otherwise, because those paths are not taken.  All the data comes from the paths that were taken, and it naturally offers justification for the broad status quo, except perhaps for incremental “improvements” that may or may not be justified–perhaps cutting off the Tlaxcalans’ heads before cutting out their hearts would make the sun rise faster, or not….  If the Tlaxcalans were not so easy to capture for human sacrifice, maybe things would be different, or not.  After all, ensuring that the sun keeps rising is a hugely important thing.  How do you know the sun will rise again without the blood flowing?  That might be too huge a risk to take…especially since there is, by definition, NO evidence whatsoever to back it up.  This leap, then, is literally one of faith, justified by no evidence but a set of contrarian beliefs, as per Kierkegaard’s argument (I think I’m linking to the right book…).

This leads to a curious paradox, in which “science” and “technology” often wind up being at odds with each other.  While “technology” might depend on “scientific” understanding, it rests on acceptance of the status quo and the need for incremental improvement of the formulas.  It does not question the validity of everyday things or raise awkward questions.  It simply says, yes, the formulas are inherently right, but we can add this one tweak and do better.  This was literally what was done to keep up Ptolemaic astronomy in the Middle Ages:  an extensive system of “tweaks,” in the form of epicycles, was added to the basic Ptolemaic formula to keep the basic structure intact.  The skepticism undergirding “science,” however, does not accept the status quo as given.  The formulas are not “inherently right,” but only provisionally so.  To learn where and when the formulas fail, some crazy risks, potentially with big repercussions, need to be taken, contrary to things that “everyone knows” to be “obviously” true.   What’s worse, the skepticism yields no obvious short-term benefits:  while Copernican theory made the calculations vastly easier, by reducing the number of epicycles required, the basic structure was still fundamentally “wrong” in both the logical and the empirical sense.  It took centuries of additional refinement to get to classical physics as we understand it, and, other than the computational ease, there was no “good” reason to take Copernicus seriously when his book was published.  As long as the argument was not offensive, however, there was no good reason to overtly “reject” it either, much the way the social conservatives who took part in the Broockman/Kalla study found little reason to persist in their hostility toward transgenders.  Indeed, without the controversies wrapped up in the Reformation and papal politics, Copernican science might have won out by osmosis anyway.  But politics happened, and Galileo was a prickly and arrogant blowhard who stepped on many toes.

This paradox flies in the face of the popular understanding of “science.”  What people take to be “science” is in fact technology.  Yet, with enough epicycles, you can make creationism compatible with vaccines, oil deposits, and even fossils, at least for a general audience.  From the perspective of the sailor, the question becomes why he can’t keep both the parrot and the compass, and the argument against the parrot is not particularly convincing, given the potential “risks” involved.  The truth is that there is precious little argument, at least in the short term, for “science.”  “Science” will not make us happier or wealthier.  It may not even make us “wiser” until much later than we’d like. So why should we give up our formulas for it, especially if its practitioners are being a collective ass?  Given the proclivity of the social sciences to butt in on the controversies of the day, coupled with the far larger uncertainties inherent in their topics of research, this is an especially pertinent question for them.

Much the same argument prevails in the public policy realm as well:  people are willing to partake in what is, essentially, a superstitious activity in the face of what appears to be a very real risk–against immigration, Muslims, the EU, and so on.  Are these fears irrational and foolish?  Perhaps.  But what assurances can you offer against the perceived risks, other than to ridicule those who fear them for fearing them in the first place and call them names?  That can only ensure that the argument against fear, already imperiled by the very real presence of the fear–even if the object of that fear may not be as real as it is deemed–will be rejected with certainty:  not only are people afraid, they are forced to deal with those who are at best uncaring, callous, and oblivious, and at worst actively seeking to prey on them. If people are behaving formulaically, they often do so for a consistent, even if not always logical, set of reasons.  They can be approached by better understanding where their formulas come from and what sustains them–although success may not always be guaranteed, as with the Aztecs and human sacrifice.  It is foolish to believe that those formulas can simply be supplanted by hectoring and ridicule.  (Ironically, of course, the same argument applies to those on the opposite side–as much as Sanders and Trump supporters, in their respective camps, are “odd” and subscribe to formulas that seem “strange,” the supporters of the conventional wisdom also subscribe to various formulas that have no logical underpinning other than that they “work” empirically–see this essay for a further exposition.)  Perhaps, if they cannot be dealt with peacefully, they can be conquistadored, like the Aztecs, and converted at sword point.  But that too is a serious undertaking.  No matter what the recourse, this is a challenge that needs to be taken seriously, and one that very few seem eager to take up.
