Sunday, January 27, 2008

More heat than light

First question: why is it so much harder to talk about politics than, say, the weather? The obvious answer is that people get a lot more emotional about their political beliefs than their meteorological ones. Okay, true enough. But why all the strong feelings? Well, because what's at stake is nothing less than who controls whom, and for what purpose.

That's probably part of it, but it's not the whole story. Most people know that their political views are very unlikely to have significant effects on the behavior of the government, much less its reach into their own lives (yes, I know, here I am blathering away as if it were otherwise). So here's another angle: emotional capital invested in political (/religious) beliefs is safe.

My mistaken beliefs about everyday reality have serious practical consequences; they are their own punishment, so it is in my interest to correct them. I had better get the behavior of unsupported objects right, or I risk dropping a hammer on my toe. Reality doesn't spank me, however, for my mistaken beliefs about history, political economy, or the origin of the universe. I will not endanger my toe if I mistakenly believe in the efficiency of markets or the historical inevitability of communism or the second coming of Christ. The predictable--and, in a sense, rational--result is that I tend to choose my religious and political beliefs for reasons other than their predictive/explanatory power (that is, their truth). The ideas I latch on to might just be in the air, evolving by something analogous to genetic drift. More interestingly, these ideas may do a lot for their hosts, and be subject to a lot of selective pressures. They may comfort, flatter, and justify me, and give purpose to my life. They may project to others my benevolence, cooperativeness, or respect for authority. They may strengthen my bonds with fellow believers and focus our collective hatred. They may reduce cognitive dissonance with the hogswallop I must recite to avoid persecution.

This kind of non-explanatory "belief" formation might be rational, in the utility-maximizing sense, but it is certainly not honest. Entertaining these notions as merely beautiful or inspiring or conventional in my tribe doesn't get the job done; I must also, in some sense, "believe" them--that is, take them to be true. And the dishonesties compound; a cherished "belief" must be protected from arguments and evidence that undermine it, whether by averting one's eyes or by gouging out those of heretics.

I keep putting "belief" in quotes because I don't think
  1. my belief that I am sitting at my computer and typing and
  2. my "belief" that I will be fed peeled grapes by seventy-three virgins (one better than that other religion down the street!) in the afterlife
are the same sort of animal at all. My belief that I am sitting in front of a computer predicts that:

  • If asked, I say that I am sitting in front of a computer.
  • I behave as if I am sitting in front of a computer. I wiggle my fingers over the keyboard in a way that would be strange and useless if I were not. My certainty that I am sitting in front of a computer could be calibrated by my willingness to bet on it.

"Belief" (2), by contrast, is usually evidenced by the saying, not the behaving. Most of the time, behaving as if you believed that blah blah blah virgins etc. after death (or that markets are efficient, or that communism is inevitable) is indistinguishable from behaving as if you believed otherwise. Some (e.g., Sam Harris) take the willingness of young men to fly planes into buildings as evidence that they really believe. To me, such dramatic gestures smell of uncertainty; those men were trying to convince themselves, or somebody. Nobody makes such a grand show of his belief that water becomes a solid when it gets cold, or that unsupported objects fall. All such gestures show is that other relations to propositions, besides belief (say, wanting them to be true; or wanting others to believe that you believe them to be true; or taking offense at being called a liar, regardless of the fact that you know you are lying...), can be powerful motives for behavior.

Other notable properties of "beliefs" include a sense of commitment, of personal importance to the holder, and unwillingness to be swayed by (or even active avoidance of) evidence or argument against them. If someone challenged my belief (1), I would be surprised, but probably not angry. If I found out that this was not a computer, but, rather, an empty box wired to some distant computer, I would be very surprised, but my identity would not be threatened. Beliefs can be held, but only "beliefs" can be cherished, or threatened, or cause incompatible beliefs to be treated as heresy.


So where does this leave us, here with the goal of having a non-heated political discussion? Hopefully, in a mood of critical introspection. My strength of conviction should not be mistaken for my degree of certainty, and neither should yours.

4 comments:

alice said...

that... was fantastic. i wish i had better words at my disposal to voice my opinion, but alas i don't. i "believe" you made some very good points, though.

Strock Bromsten said...

I like the idea of some sorts of beliefs being of a different type than others.

Here's something that I couldn't get straight in what you were saying -- on the one hand I think you are saying that the emotional fervor around politics or religion is safe because they're kind of non-empirical, or empirically non-consequential. But then you have 9/11 as smacking of the perps' uncertainty about their politicoreligious ideas -- maybe that they are not working out, or that the modern world seems to be at odds with them. Killing yourself and thousands because of a politicotheological unease seems at odds with those sorts of beliefs being safe. Maybe it is because they are empirically non-consequential that they are so emotionally charged?

Sean said...

What I meant by "safe" is that at least some politicoreligious notions are emotionally attractive, in part, because they have no practical consequences. I think these notions usually are "empirically consequential", in that they make predictions, though often very diffuse.

The bad feeling (intellectual conscience?) of avowing ideas that are obviously false, but emotionally valuable and not damaging to one's practical interests, is a topic for another day (year? I need more spare time...).

Amy. said...

The big question, then, is how do you handle "beliefs" in areas where the empirical facts of the matter are mushy? You have political opinions/beliefs that are based on "principles" that are based on ideas about human nature that may or may not be correct, as they're essentially untestable. But you (and I) generally think you're right, and that it's not a futile exercise to think/argue about it. Seems like lots of things are like this--politics, economics, religion (although that seems easier...)

Another question--are there situations in which the emotional truth of the matter could actually be as important/correct as the rational one?