I found a journal article that looked at the Fukushima disaster in Japan from a systemic perspective. It argues that there were three main organization-oriented problems leading to the disaster:
1. The power plant was built in a known earthquake zone.
2. Aging of plant equipment, and intentional concealment of associated problems.
3. Deterioration of governing organizations to oversee safety and administration.
Thus, it concludes, the disaster resulted from a failure to think in terms of “whole systems.”
Sounds good. But the platitude “just think holistically” is not enough. After all, in any disaster or mistake, one can always make that argument. It’s a fancy way of saying “I told you so.” The regulators who approved the plant’s reactor No. 1 just a month before the meltdown might have said at the time that they were thinking systemically, by taking into account peculiar Japanese social and governing factors not obvious to outsiders. Then again, maybe they were just plain blind. In either case, simply invoking a systemic perspective, by itself, can’t work.
How do we deal with what we don’t know? That’s the relevant question. It seems to me we need a way of managing our ignorance, which, in this case, would have included ignorance of systemic problems.