A friend observes:
Listening to the nuclear power plant safety portion of John Hockenberry's NPR show a few minutes ago, I suddenly had a moment of clarity about my own emotional biases. If I'm reading or listening to someone and most or all of what they think or claim has no admitted contradictions, no difficult choices, no acknowledged problems, no legitimate entry for debate or disagreement, and no confession of doubt or uncertainty, I fundamentally mistrust them at a very primal, emotional level, whether or not I know anything else about the issue at hand. The older I get and the more I know and have seen, the more I distrust even (perhaps especially) my own judgment, in inverse proportion to how certain I feel about an issue.
And goes on to say:
Oh, there is a short road from this feeling to crippling self-doubt. It's also a reason that academics in general are shitty communicators who over-qualify everything. This is really more a feeling than an argument — it's a gut thing for me. In a way it makes me easy to manipulate in the other direction — all I need is someone who entertains some doubts and ambiguity and I'm foolishly reassured about what they want or think.
As someone who shares this same impulse — and thus of course sees it as a mark of sophistication — I take it as an adaptation to what Julian Sanchez calls “one way hash arguments.”
Sometimes the arguments are such that the specialists can develop and summarize them to the point that an intelligent layman can evaluate them. But often—and I feel pretty sure here—that’s just not the case. Give me a topic I know fairly intimately, and I can often make a convincing case for absolute horseshit. Convincing, at any rate, to an ordinary educated person with only passing acquaintance with the topic. A specialist would surely see through it, but in an argument between us, the lay observer wouldn’t necessarily be able to tell which of us really had the better case on the basis of the arguments alone — at least not without putting in the time to become something of a specialist himself. Actually, I have a possible advantage here as a peddler of horseshit: I need only worry about what sounds plausible. If my opponent is trying to explain what’s true, he may be constrained to introduce concepts that take a while to explain and are hard to follow, trying the patience (and perhaps wounding the ego) of the audience.
Come to think of it, there’s a certain class of rhetoric I’m going to call the “one way hash” argument. Most modern cryptographic systems in wide use are based on a certain mathematical asymmetry: You can multiply a couple of large prime numbers much (much, much, much, much) more quickly than you can factor the product back into primes. A one-way hash is a kind of “fingerprint” for messages based on the same mathematical idea: It’s really easy to run the algorithm in one direction, but much harder and more time-consuming to undo. Certain bad arguments work the same way — skim online debates between biologists and earnest ID aficionados armed with talking points if you want a few examples: The talking point on one side is just complex enough that it’s both intelligible — even somewhat intuitive — to the layman and sounds as though it might qualify as some kind of insight. (If it seems too obvious, perhaps paradoxically, we’ll tend to assume everyone on the other side thought of it themselves and had some good reason to reject it.) The rebuttal, by contrast, may require explaining a whole series of preliminary concepts before it’s really possible to explain why the talking point is wrong. So the setup is “snappy, intuitively appealing argument without obvious problems” vs. “rebuttal I probably don’t have time to read, let alone analyze closely.”
If we don’t sometimes defer to the expert consensus, we’ll systematically tend to go wrong in the face of one-way-hash arguments, at least outside our own necessarily limited domains of knowledge. Indeed, in such cases, trying to evaluate the arguments on their merits will tend to lead to an erroneous conclusion more often than simply trying to gauge the credibility of the various disputants ....
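To make the asymmetry Sanchez is leaning on concrete, here is a small illustrative sketch of my own (not from his post): multiplying two primes or hashing a message runs in effectively no time, while undoing either, even naively and at toy scale, already costs orders of magnitude more, and the gap explodes as the inputs grow. The particular primes, the naive trial-division factorer, and the truncated-prefix preimage search are all just choices to keep the demo runnable in a few seconds.

```python
# Illustrative sketch only: the "easy forward, hard backward" asymmetry behind
# the one-way hash metaphor, using nothing but the Python standard library.

import hashlib
import time


def trial_division_factor(n):
    """Recover the prime factors of n by brute force (slow on purpose)."""
    if n % 2 == 0:
        return 2, n // 2
    d = 3
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 2
    return n, 1  # n is prime


def find_preimage(target_prefix, max_tries=10_000_000):
    """Guess inputs until one's SHA-256 digest starts with target_prefix."""
    for i in range(max_tries):
        guess = str(i)
        if hashlib.sha256(guess.encode()).hexdigest().startswith(target_prefix):
            return guess
    return None


if __name__ == "__main__":
    p, q = 1_000_003, 1_000_033  # two modest primes

    # Forward direction: multiplication is effectively instantaneous.
    t0 = time.perf_counter()
    n = p * q
    print(f"multiply: {p} * {q} = {n} ({time.perf_counter() - t0:.6f}s)")

    # Backward direction: even naive factoring of this small product is far
    # slower, and the gap explodes as the primes get longer.
    t0 = time.perf_counter()
    print(f"factor:   {trial_division_factor(n)} ({time.perf_counter() - t0:.3f}s)")

    # Hashing a message also runs in microseconds...
    msg = "snappy, intuitively appealing argument"
    digest = hashlib.sha256(msg.encode()).hexdigest()
    print(f"hash:     {digest}")

    # ...but inverting it means guessing. This only finishes because we ask for
    # a match on the first 5 hex characters, not the whole 64-character digest.
    t0 = time.perf_counter()
    print(f"preimage: {find_preimage(digest[:5])!r} ({time.perf_counter() - t0:.2f}s)")
```

Strictly speaking, cryptographic hashes don't rest on the difficulty of factoring, but both give the metaphor its shape: cheap to state, expensive to undo. That is the lay reader's position when the talking point costs thirty seconds and the rebuttal costs an afternoon.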
Update: Scott (who I think is the same Scott Alexander of the excellent, deeply problematic Slate Star Codex) calls his response to this "epistemic learned helplessness."
I am obviously just gullible in the field of ancient history. Given a total lack of independent intellectual steering power and no desire to spend thirty years building an independent knowledge base of Near Eastern history, I choose to just accept the ideas of the prestigious people with professorships in Archaeology rather than the universally reviled crackpots who write books about Venus being a comet.
I guess you could consider this a form of epistemic learned helplessness, where I know any attempt to evaluate the arguments is just going to be a bad idea so I don't even try. If you have a good argument that the Early Bronze Age worked completely differently from the way mainstream historians believe, I just don't want to hear about it. If you insist on telling me anyway, I will nod, say that your argument makes complete sense, and then totally refuse to change my mind or admit even the slightest possibility that you might be right.