Not Liking Uncertainty

The idea I want to talk about here is that people don't like uncertainty—or, to put it another way, that there's something about uncertainty that makes it an unpleasant experience for people, much like cognitive dissonance. The idea isn't just my own, but unfortunately I can't remember where else I've seen it.

Of course, by “uncertainty” I don't mean just any old uncertainty. I might be uncertain whether the mail has arrived, but I can always go check, and anyway it's no big deal one way or the other. To be unpleasant, the uncertainty has to satisfy certain conditions.

  • First, the uncertainty has to be important to you. And what makes an uncertainty important? Being about someone or something important.
  • Second, the uncertainty has to be unresolvable by you. I say “by you”, but that isn't as narrow a condition as you might think. If someone else can clear up the uncertainty, then you can usually resolve it by talking to that someone. So, an uncertainty is unresolvable if there's nothing you can do to resolve it, either directly or indirectly.
  • Third, although it's not an absolute condition, the uncertainty is that much more unpleasant if it's indefinite. And what do I mean by that? Well, almost everything gets resolved sooner or later, at some definite or indefinite future time. So, an uncertainty is indefinite if you don't even know when it's going to be resolved.

As a nice impersonal example, consider the uncertainty following the presidential election in 2000. It was important, indefinite, and, for most of us, unresolvable. Was it unpleasant? Yes, but there's more to it than that, as I should have explained earlier. For me, at least, uncertainty is unpleasant in a very specific way. I get fixated on wanting the uncertainty to be resolved, and I don't want to think about anything else in the meantime.

Although I tend to think of it that way, uncertainty doesn't have to be only about facts and outcomes; it can also be about (for example) reasons—why someone did something, why something happened. That kind of uncertainty, unfortunately, doesn't always get resolved in the end. Sometimes, in fact, things happen for essentially no reason at all. Then, I think, not liking uncertainty pushes the mind to invent reasons—religious ones, among others.

The following passage, from Why People Believe Weird Things, is interesting mainly because it's the only reference to the idea of not liking uncertainty that I could find, but also because it makes the same connection between explanations (reasons) and certainty.

Most of us, most of the time, want certainty, want to control our environment, and want nice, neat, simple explanations. All this may have some evolutionary basis, but in a multifarious society with complex problems, these characteristics can radically oversimplify reality and interfere with critical thinking and problem solving.

Finally, here is one more thought I've had. If you're mathematically inclined, like me, you might be tempted to take the following view of things.

Uncertainty is really just a matter of probabilities and outcomes; not liking uncertainty is nothing but the fact that a small probability of a bad outcome is still relatively bad, i.e., has a negative expectation value.
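
To make that view concrete, here is a minimal sketch in LaTeX notation, with p, V_bad, and V_ok as placeholder symbols of my own (a small probability, a bad outcome, and a neutral outcome):

  E[V] = p \cdot V_{bad} + (1 - p) \cdot V_{ok}, \quad 0 < p \ll 1, \; V_{bad} < 0, \; V_{ok} = 0 \;\Rightarrow\; E[V] = p \cdot V_{bad} < 0.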

I can see two problems with that view. For one thing, nothing I've said about uncertainty implies a bad outcome! We could equally well be talking about a small probability of a good outcome, which ought to be a good thing. And, perhaps it is, but to me it seems diminished by the presence of uncertainty. That leads to a different view.

Not liking uncertainty is the fact that there is a small cost or penalty that modifies the expectation value.
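
In the same sketchy notation, with c > 0 a small placeholder cost charged just for the uncertainty being unresolved, the expectation value becomes

  E'[V] = p \cdot V_{good} + (1 - p) \cdot V_{bad} - c, \quad c > 0,

so even a mixture of outcomes that looks favorable on its own gets dragged down a little by c.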

I don't think there's a real connection, but it's amusing to me that uncertainty carries a certain amount of information, i.e., of complexity, and so ought to be considered as a cost.
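
(For the record, the amount of information in a two-outcome uncertainty is the Shannon entropy, a standard formula rather than anything specific to this page:

  H(p) = -p \log_2 p - (1 - p) \log_2 (1 - p),

which is largest at p = 1/2, exactly when the uncertainty is greatest, so a cost proportional to H(p) would be one way to formalize the aside.)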

The real problem (with either view) is that expectation values are a tool for rational thought, and not liking uncertainty is essentially irrational. It is a quirk of the mind, which of course is why I'm writing about it here. If I'm troubled by an important, unresolvable uncertainty, you can adjust the expectation value all you like—by, say, paying me money—and I'll still be troubled.

 

  See Also

  Fire, The
  Projection
  Restaurant Effect, The
