Rational about Rationality
You don’t need science to survive, but you need to survive to do science.
The best definition of rationality: actions that increase the chances of survival.
Herbert Simon formulated the idea of bounded rationality: we cannot deal with the world as if we were computers, so under evolutionary pressures we rely on shortcuts and distortions. Gerd Gigerenzer leads an effort to take this further by showing that what first appears irrational often has deep reasons behind it that are not obvious. Ken Binmore argued that the word “rational” is badly defined. There is nothing about beliefs that can be considered innately irrational.
Beliefs are…cheap talk. There may be some kind of translation mechanism too hard for us to understand, with distortions at the level of the thought process that are themselves necessary for things to work.
WHAT IS RELIGION ABOUT?
Taleb thinks that religion exists to enforce tail risk management across time.
Superstitions can help you avoid ruin. Jared Diamond discusses the “constructive paranoia” of Papua New Guinea residents: their superstition keeps them from sleeping under dead trees. It doesn’t matter why you do it, for scientific reasons or otherwise; it only matters that you do it (avoid sleeping under dead trees).
When you consider beliefs in evolutionary terms, do not look at how they compete with each other, but consider the survival of the populations that have them.
ERGODICITY
Ergodicity holds when past probabilities can inform future events; a situation is non-ergodic when past probabilities no longer apply. Russian roulette, for example, is non-ergodic because a risk of ruin is present. Imagine a round that pays $1 million if you survive: with a 5/6 (83.33 percent) chance of winning, the expected gain is about $833,333, yet cost-benefit analysis is pointless, since repeated play eventually results in death. When the risk of ruin is present, the rate of return doesn’t matter.
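A minimal simulation makes the ensemble-versus-time distinction concrete. The $1 million payout and the 5/6 survival odds follow the Russian roulette example above; the function names and simulation sizes are illustrative assumptions, not from the book.

```python
import random

PAYOUT = 1_000_000   # assumed gain per surviving round
P_SURVIVE = 5 / 6    # ~83.33% chance of surviving one round

def ensemble_average(players=100_000):
    """Many different people each play ONE round: the average payoff looks great."""
    total = sum(PAYOUT for _ in range(players) if random.random() < P_SURVIVE)
    return total / players          # hovers around $833,333

def time_path(rounds=100):
    """ONE person plays round after round: ruin is absorbing, so gains don't matter."""
    wealth = 0
    for _ in range(rounds):
        if random.random() >= P_SURVIVE:
            return 0                # dead: accumulated winnings are irrelevant
        wealth += PAYOUT
    return wealth

print(ensemble_average())                                        # ~833,333
print(sum(time_path() == 0 for _ in range(10_000)) / 10_000)     # ~1.0 ruined
```

The ensemble average stays near $833,333, while virtually every individual path that keeps playing ends at zero; that gap is the non-ergodic point.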
REPETITION OF EXPOSURES
In real life, every single bit of risk you take adds up to reduce your life expectancy. If you climb mountains, ride a motorcycle, hang around the mob, fly your own small plane, drink absinthe, smoke cigarettes, and do parkour on Thursday nights, your life expectancy is considerably reduced, although no single activity has a meaningful effect on its own. This repetition of exposures makes paranoia about some low-probability events, even paranoia deemed “pathological,” perfectly rational.
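A back-of-the-envelope sketch shows how the repetition compounds; the one-in-a-thousand figure per outing is an assumed number for illustration, not Taleb's.

```python
# Probability of at least one ruinous outcome after n independent exposures,
# each carrying a small per-exposure ruin probability p: 1 - (1 - p) ** n.
def cumulative_ruin(p_per_exposure: float, exposures: int) -> float:
    return 1 - (1 - p_per_exposure) ** exposures

p = 1 / 1000   # assumed "negligible" one-in-a-thousand risk per outing
for n in (1, 50, 500, 5000):
    print(n, round(cumulative_ruin(p, n), 3))
# 1 -> 0.001, 50 -> 0.049, 500 -> 0.394, 5000 -> 0.993
```

No single outing moves the needle, but the habit does.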
It might seem intuitive to rest easy because medicine is improving your life expectancy, but the opposite holds: the longer you live, the more low-probability “ruin” situations you expose yourself to, so you should be more paranoid, not less.
RATIONALITY, AGAIN
Warren Buffett did not make his money through cost-benefit analysis, but by establishing a high filter and picking the opportunities that pass that threshold.
“The difference between successful people and really successful people is that really successful people say no to almost everything,” he said. Likewise our wiring might be adapted to “say no” to tail risk. For there are a zillion ways to make money without taking tail risk. There are a zillion ways to solve problems (say, feed the world) without complicated technologies that entail fragility and an unknown possibility of tail blowup.
LOVE SOME RISKS
Small injuries can be beneficial, as shown in Antifragile, while large ones can have irreversible effects. What is volatile is not necessarily risky, and what is risky is not necessarily volatile. Taking risks is good and necessary, but this is not the same as doing something that might result in ruin.
One may be risk loving yet completely averse to ruin. The central asymmetry of life: in a strategy that entails ruin, benefits never offset the risk of ruin. Further, ruin and other changes in condition are different animals, and every single risk you take adds up to reduce your life expectancy. Finally, rationality is avoidance of systemic ruin.