Just read The Unaccountability Machine by Dan Davies, which is mostly a cybernetic (as in, the study of decision-making systems) account of how organisations such as corporations make decisions, and how they make bad ones partly because the system is constructed so that no individual ever makes a decision that would leave them accountable (“the accountability sink”).
The insight of POSIWID (the purpose of a system is what it does) really stuck with me. If a hospital hurts more people than it helps, then its purpose is to hurt people, no matter what everyone involved thinks. The purpose of the animal shelter at Schiphol Airport in 1999 was to shred squirrels. The purpose of Facebook is to gather and sell personal data.
There’s also this, about how more knowledge and expertise about a complex system isn’t necessarily helpful:
“It is a sobering thought, for example, that despite employing some of the best and brightest analysts in the world, the advice given by the US State Department over the last fifty years could comfortably have been outperformed by a parrot that had been trained to repeat the phrase, ‘Don’t start a war.’ The repeated failures of the State Department are not the consequence of ignorance; they are the consequence of having very good and deep – but not total – knowledge of an extremely complicated situation, in which facts outside of that information set turned out to be crucial.”