Welcome to The Nonlinear Library, where we use Text-to-Speech software to convert the best writing from the Rationalist and EA communities into audio.
This is by Eliezer Yudkowsky, published on LessWrong.
Follow-up to: An Equilibrium of No Free Energy
There’s a toolbox of reusable concepts for analyzing systems I would call “inadequate”—the causes of civilizational failure, some of which correspond to local opportunities to do better yourself. I shall, somewhat arbitrarily, sort these concepts into three larger categories:
1. Decisionmakers who are not beneficiaries;
2. Asymmetric information;
and above all,
3. Nash equilibria that aren’t even the best Nash equilibrium, let alone Pareto-optimal.
In other words:
1. Cases where the decision lies in the hands of people who would gain little personally, or lose out personally, if they did what was necessary to help someone else;
2. Cases where decisionmakers can’t reliably learn the information they need to make decisions, even though someone else has that information; and
3. Systems that are broken in multiple places so that no one actor can make them better, even though, in principle, some magically coordinated action could move to a new stable state. (A toy sketch of this last case follows the list.)
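Since the third category does most of the work in this taxonomy, a toy model may help. The following Python sketch is my own illustration, not anything from the original essay: it encodes a two-player coordination game in which “everyone sticks with the broken system” and “everyone switches” are both Nash equilibria, but only the latter is Pareto-optimal. All payoff numbers are assumptions chosen for clarity.

```python
# A minimal sketch (not from the original essay): a two-player coordination
# game with two pure Nash equilibria, one strictly worse than the other.
# Strategy 0 = "stick with the bad system", strategy 1 = "switch".
# payoffs[(a, b)] = (payoff to player A, payoff to player B).
payoffs = {
    (0, 0): (1, 1),  # both stick: bad but stable
    (0, 1): (0, 0),  # mis-coordination punishes the lone switcher
    (1, 0): (0, 0),
    (1, 1): (3, 3),  # both switch: better equilibrium, needs coordination
}

def is_nash(a, b):
    """True if neither player gains by unilaterally deviating from (a, b)."""
    pa, pb = payoffs[(a, b)]
    a_deviates = payoffs[(1 - a, b)][0]  # A's payoff if A alone deviates
    b_deviates = payoffs[(a, 1 - b)][1]  # B's payoff if B alone deviates
    return pa >= a_deviates and pb >= b_deviates

for a in (0, 1):
    for b in (0, 1):
        if is_nash(a, b):
            print((a, b), "is a Nash equilibrium with payoffs", payoffs[(a, b)])
```

Running it prints both (0, 0) and (1, 1) as equilibria: no single player can profit by deviating from the bad state alone, which is exactly what makes an inferior equilibrium stable even when everyone would prefer the coordinated move.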
I will then play fast and loose with these concepts in order to fit the entire Taxonomy of Failure inside them.
For example, “irrationality in the form of cognitive biases” wouldn’t obviously fit into any of these categories, but I’m going to shove it inside “asymmetric information” via a clever sleight-of-hand. Ready? Here goes:
If nobody can detect a cognitive bias in particular cases, then from our perspective we can’t really call it a “civilizational inadequacy” or “failure to pluck a low-hanging fruit.” We shouldn’t even be able to see it ourselves. So, on the contrary, let’s suppose that you and some other people can indeed detect a cognitive bias that’s screwing up civilizational decisionmaking.
Then why don’t you just walk up to the decisionmaker and tell them about the bias? Because they wouldn’t have any way of knowing to trust you rather than the other five hundred people trying to influence their decisions? Well, in that case, you’re holding information that they can’t learn from you! So that’s an “asymmetric information problem,” in much the same way that it’s an asymmetric information problem when you’re trying to sell a used car: you know it doesn’t have any mechanical problems, but you have no way of reliably conveying this knowledge to the buyer, because for all they know you could be lying.
That argument is a bit silly, but so is the notion of trying to fit the whole Scroll of Woe into three supercategories. And if I named more than three supercategories, you wouldn’t be able to remember them due to computational limitations (which aren’t on the list anywhere, and I’m not going to add them).
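The used-car scenario above is the classic “market for lemons” from economics. Here is a minimal Python sketch of that unraveling dynamic, again my own illustration with assumed numbers rather than anything from the essay: quality is uniform on [0, 1], the seller knows it, the buyer doesn’t, and the buyer values any car at 1.5 times its quality, so every trade would create value.

```python
# A minimal lemons-market sketch (my construction, assumed numbers).
# Quality q is uniform on [0, 1]; a car is worth q to its seller and
# 1.5 * q to the buyer. At any price p, only sellers with q <= p will
# sell, so the expected quality of cars on offer is p / 2, worth
# 1.5 * (p / 2) = 0.75 * p to the buyer -- always less than p.
# Iterating the buyer's best response drives the price toward zero.
price = 1.0
for step in range(10):
    expected_quality = price / 2       # mean of q given q <= price
    price = 1.5 * expected_quality     # the most the buyer will pay
    print(f"step {step}: buyer willing to pay {price:.4f}")
```

Each round the buyer’s willingness to pay falls by a quarter, so trade collapses toward zero even though every individual trade would have been mutually beneficial; the only missing ingredient is a credible way to transmit the seller’s knowledge.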
i. For want of docosahexaenoic acids, a baby was lost
My discussion of modest epistemology in Chapter 1 might have given the impression that I think of modesty mostly as a certain set of high-level beliefs: beliefs about how best to combat cognitive bias, about how individual competencies stack up against group-level competencies, and so on. But I predict that many of this book’s readers have high-level beliefs similar to those I outlined in Chapter 2, while employing a reasoning style that is really a special case of modest epistemology; and I think that this reasoning style is causing them substantial harm.
As reasoning styles, modest epistemology and inadequacy analysis depend on a mix of explicit principles and implicit mental habits. In inadequacy analysis, it’s one thing to recognize in the abstract that we live in a world rife with systemic inefficiencies, and quite another to naturally perceive systems that way in daily life. So my goal here won’t be to unkindly stick the label “inadequate” to a black box containing the world; it will be to say something about how the relevant systems a...