Welcome to The Nonlinear Library, where we use Text-to-Speech software to convert the best writing from the Rationalist and EA communities into audio. This is: Duct Tape security, published by Isaac King on April 26, 2024 on LessWrong.
This is a linkpost for On Duct Tape and Fence Posts.
Eliezer writes about fence post security: people ask themselves "in the current system, what's the weakest point?", then dedicate their resources to shoring up the defenses at that point, not realizing that after the first small improvement there, a new weakest point has likely appeared somewhere else.
Fence post security happens preemptively, when the designers of a system fixate on its most salient aspect(s) and don't consider the rest. But this sort of fixation can also happen in retrospect, in which case it manifests a little differently but has similarly deleterious effects.
Consider a car that starts shaking whenever it's driven. It's uncomfortable, so the owner gets a pillow to put on the seat. Items start falling off the dash, so they get a tray to put them in. A crack forms, so they tape over it.
I call these duct tape solutions. They address symptoms of the problem, but not the root cause. The underlying issue still exists and will continue to cause problems until it's addressed directly.1
Did you know it's illegal to trade onion futures in the United States? In 1955, some people cornered the market on onions, shorted onion futures, then flooded the market with their saved onions, causing a bunch of farmers to lose money. The government responded by banning the sale of futures contracts on onions.
Not by banning futures trading on all perishable items, which would be equally susceptible to such an exploit. Not by banning market-cornering in general, which is pretty universally disliked. By banning futures contracts on onions specifically. So of course the next time someone wants to try such a thing, they can just do it with tomatoes.
Duct-tape fixes are common in the wake of anything that goes publicly wrong. When people get hurt, they demand change, and they pressure whoever is in charge to give it to them. But implementing a proper fix is generally more complicated (since you have to perform a root cause analysis), less visible (therefore not earning the leader any social credit), or just plain unnecessary (if the risk was already priced in).
So the incentives favor quickly slapping something together that superficially appears to be a solution, without regard for whether it makes sense.
Of course, not all changes in the wake of a disaster are duct-tape fixes. A competent organization treats a disaster as a source of new information about the system in question; it then considers how it would design the system from scratch taking that information into account, and proceeds from there to make changes. Proper solutions attempt to fix a general class of issues, not just the exact thing that failed.
Bad: "Screw #8463 needs to be reinforced."
Better: "The unexpected failure of screw #8463 demonstrates that the structural simulation we ran before construction contained a bug. Let's fix that bug and re-run the simulation, then reinforce every component that falls below the new predicted failure threshold."
Even better: "The fact that a single bug in our simulation software could cause a catastrophic failure is unacceptable. We need to implement multiple separate methods of advance modeling and testing that won't all fail in the same way if one of them contains a flaw."
Ideal: "The fact that we had such an unsafe design process in the first place means we likely have severe institutional dysfunction. We need to hire some experienced safety/security professionals and give them the authority necessary to identify any other flaws that may exist in our company, including whatever processes in our leadership and hiring teams led to us not having such a security team ...