This is: Concentration of Force, published by Duncan_Sabien on LessWrong.
This essay began as part one of a longer piece. Part one is standalone and "timeless." Part two focuses on the local dynamics of the EA/rationality/longtermist communities and LessWrong in November of 2021. Following wise advice from Zack_M_Davis, I've split them into two separate posts. Nevertheless, I recommend that people intending to read both seriously consider reading them back-to-back, so that the content of this one is fresh in the mind. It's both something of a prerequisite and relevant context-setting.
Introduction
Concentration of force is a military concept (sometimes referred to as "mass"). It used to be concentration of forces, until innovations like machine guns and cruise missiles made gathering all of your actual personnel together into more of a liability.
The idea is simple. Essentially, there is a difference between relevant moments and irrelevant moments. Battles and non-battles, moments of engagement and moments between engagements.
At each relevant moment, you want to project locally superior or overwhelming force. Perhaps this means having the most soldiers/guns/tanks/planes actually present, or perhaps this just means having the right missiles pointed in the right directions.
If you are good at coordination and maneuver, you can concentrate force in most or every engagement, and consistently win even against an overall larger or more powerful opponent. This is how guerrilla warfare works—you choose the time and place of conflict in order to ensure that you outnumber the enemy in each specific encounter, and you fade into the mists before their reinforcements arrive.
[Image: Red wins each of the depicted engagements handily.]
I claim that the Grey Tribe generally, and rationalists/longtermists/EAs more specifically, and LessWrong the website and community even more specifically, are systematically and spectacularly failing at concentration of force. That none of those groups puts anything like sufficient strategic energy into ensuring that critical mass coheres at crucial moments, and that each would benefit from optimizing their ability to do so quickly, reliably, and effectively, and from thinking in terms of concentration of force as a matter of habit.
I anticipate a (reasonable) objection along the lines of "mistake theory rather than conflict theory!" or "generous tit-for-tat rather than vengeful tit-for-tat!" and I assert a further subclaim that the above advice is every bit as relevant for nonviolent and nonconfrontational frames. Actual conflicts are a vanishingly small subset of the times when concentration of force is a relevant principle; the "force" in question could just as easily be e.g. "calm, generous, charitable, level-headed, clear-minded, skilled communicators arriving at exactly the moment when things were about to become wastefully contentious and adversarial."
(Indeed, that's a preview of the recommendation I have for LessWrong specifically.)
TAPs: A Motivating Example
There's a picture I tend to draw quite frequently when giving people crash courses in CFAR-esque rationality, and it looks like this:
The idea behind the picture is that you're trucking along, living a generally good and happy life, and then something happens, and you find yourself in the sad timeline. You ate an entire package of Oreos, despite intending to lose weight. You got in another fight with your romantic partner, despite really not wanting to. You road raged, you failed to finish the presentation before the deadline, you spent all evening on Reddit instead of thinking about your research, you somehow never called them back and now it's too awkward.
The point of showing people this picture is to draw their attention to two key facts:
For most goals and values, there actually exis...