In 2017, Richard A. Clarke and R.P. Eddy published a book entitled Warnings, with the subtitle “Finding Cassandras to Stop Catastrophes”.
I find the book remarkable because it approaches the problem of complex systems from an entirely different field than sustainment. And yet their findings reinforce many key aspects of the Complex System Sustainment Management Model.
Cassandra was a princess of ancient Troy. The god Apollo gave her the ability to see catastrophe approaching, but he also cursed her so that no one would believe her warnings. Operationally, the opposite of a Cassandra is the fairy-tale character Chicken Little, whose repeated cries that the sky is falling should not be taken seriously.
Clarke and Eddy use the book to detail seven recent disasters and their modern-day Cassandras: the invasion of Kuwait, Hurricane Katrina, the rise of ISIS, the Fukushima nuclear disaster, Madoff’s Ponzi scheme, the Upper Big Branch mine disaster, and the 2008 recession. In each case, credible people gave warning but were ignored. By analyzing these events, the authors believe they have uncovered important characteristics of a) the warnings, b) the people who could have made decisions to avert the disasters, c) the Cassandras, and d) the Cassandras’ critics. The purpose of the book is to teach the reader these characteristics so that they can separate the Chicken Littles from the Cassandras.
They do this in chapter 9 with a “Cassandra Coefficient”. The following paragraphs discuss what Clarke and Eddy said and how it applies to the Complex System Sustainment Management Model.
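The book does not reduce the Cassandra Coefficient to an equation, but as a rough illustration, one could imagine scoring a warning across the four areas above. The factor names, 0–3 scale, and equal weighting in the sketch below are my own assumptions, not the authors’ method.

```python
# A minimal sketch of a "Cassandra Coefficient"-style screening score.
# Clarke and Eddy do not give a formula; the factor names, 0-3 scale,
# and equal weighting here are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class WarningAssessment:
    # Each factor scores 0 (no concern) to 3 (strong concern), grouped
    # by the four areas the book examines.
    warning: dict = field(default_factory=dict)
    decision_makers: dict = field(default_factory=dict)
    cassandra: dict = field(default_factory=dict)
    critics: dict = field(default_factory=dict)

    def coefficient(self) -> float:
        """Average across all scored factors; higher suggests a true Cassandra."""
        groups = [self.warning, self.decision_makers, self.cassandra, self.critics]
        scores = [v for group in groups for v in group.values()]
        return sum(scores) / len(scores) if scores else 0.0

# Usage: screen a hypothetical warning about an emerging failure mode.
assessment = WarningAssessment(
    warning={"useful_response_exists": 3, "never_happened_before": 2},
    decision_makers={"diffused_responsibility": 3, "agenda_inertia": 2},
    cassandra={"proven_technical_expert": 3, "data_driven": 3},
    critics={"demands_final_proof": 2, "vested_interest": 1},
)
print(f"Cassandra coefficient: {assessment.coefficient():.2f} out of 3")
```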
- Is there a useful response to the Cassandra’s warning? If not, then it is not a true warning, because the outcome is inevitable. This reminds us of those engineers who come to sustainment risk meetings to identify a “risk” to mission readiness that has already been realized. That is no risk; it is a current crisis.
- Are most people pooh-poohing the warning because it has never happened before? This should not be a trap for sustainment, because risks to the sustainment of a system are usually emerging failure modes that were never foreseen by the design engineers. A system that lives decades past its design life will display failure modes that have never been seen before. Dealing with these issues before they affect the mission (in other words, effective sustainment) means that early symptoms must be noticed and taken seriously.
- Very closely related to this is the danger of expert consensus. When a Cassandra points to solid evidence of an impending serious problem based on emerging failure modes, a consensus of expert designers sometimes arises to say the outcome couldn’t possibly be that bad. Are they basing their opinion on data and facts, or just tradition? Focusing the experts on the Cassandra’s data and analysis will often win them over to the idea that dire times are indeed coming.
- If the problem being described is huge, many will shy away, thinking that anything that big couldn’t possibly be happening. The cure is to move from emotional reactions to quantitative analysis.
- If the horrible event on the horizon appears outlandish, resistance is inevitable. No matter how much solid evidence a Cassandra has of, for instance, industrial espionage in the sustainment office, many will say he or she has just been watching too many cheap TV plots, simply because the risk sounds so outlandish. Again, leaders must intervene and insist that the data be viewed objectively.
- The last category of warnings is the “invisible obvious”. The problem is obvious, but only if you are looking in that direction. This category comes down to the old adage: “It is amazing what you can discover by looking.” If the sustainment office has a thorough system observation and assessment program that covers all readiness factors (e.g., reliability, accuracy, hardness against attack), then there will be no invisible problems (see the coverage sketch after this list).
- In many of these disasters, Clarke and Eddy found that those who might have made a difference did not see themselves in that role due to diffusion of responsibility. In complex system sustainment, it is wrong, but typical, to spread the sustainment of a system across many different organizations and locations, all with complicated reporting chains. If a fleet of near-space drones providing worldwide internet links suddenly finds itself with falling availability rates, who should have seen it coming and dealt with it before it happened? The avionics repair depot in Chicago, the engine overhaulers in Los Angeles, or the mission planners in New York? A good sustainment organization has responsibility for the entire system and the entire mission.
- The current agenda can steal all the resources needed to deal with the impending disaster. Under the sustainment management model, all risks are brought to the monthly risk review meeting with the goal of complete understanding. The output is a prioritized list, hopefully created early enough to deal with issues before they affect the mission (see the prioritization sketch after this list). The question of resources comes only after that prioritized list of well-defined issues exists.
- Decision-makers might feel overwhelmed by the complexity of the issues being presented. They might even bluster to hide their feelings of inadequacy. The sustainment management model avoids this trap because it requires all sustainers to eventually take their data and information and repackage it in a form that convinces decision-makers to release funding. This is simply part of the job.
- Preconceived notions are a difficult, sometimes impossible, bias to fight. A person’s preconceptions or ideology can bias them toward inaction: if the solution requires actions that go against their worldview, it is easier to disbelieve the magnitude of the impending disaster. In the sustainment world, this sometimes happens when risks are raised about the inadequacy of the system’s current observation and assessment program. Typical rebuttals are: “You engineers always want more and more data. You are never satisfied.” Under a good sustainment program, the question should instead be: “Are we covering all parts of the system against all readiness factors?”
- Sometimes the decision-makers simply lack the personal courage to make waves. Institutionalizing the discovery of risks via the complex system sustainment management model at least exposes such cowards for who they are.
- Another tactic of decision-makers is to push forward an inadequate response to the risk. Under the scrutiny of the monthly risk meetings and annual risk reviews, these faux mitigations get exposed for what they are.
- The last of the issues that Clarke and Eddy found in un-averted disasters was the inability of the decision-makers to recognize that something unusual was emerging and needed to be addressed. Under the sustainment management model, everyone in the organization is encouraged to come forth with the evidence of emerging problems.
- Clarke and Eddy found that real Cassandras were, first and foremost, proven technical experts. This is a lesson for the manager and executive pulling together and preserving their sustainment team. Find experts and keep them.
- Experts who are acclaimed and sought out for their expertise can nonetheless have character flaws, such as off-putting personalities. When these people, with confidence born of usually being right, are told they are wrong, they can become even more abrasive. This is seen quite a bit in the sustainment of complex systems. Some have been experts since the design phase of the system and are now experts in how their much older system behaves.
- Great experts are always data-driven. They pay attention to where the data takes them, not where they expect it to go.
- Experts are usually orthogonal thinkers, coming at the same problem from many different angles.
- Experts, given all the above, make the best skeptical questioners.
- In addition, experts have a deep sense of personal responsibility that makes them want to stick with the complex system they know so well.
- Unsurprisingly, the attributes above are often accompanied by anxiety.
- Critics of Cassandras are not a bad thing; warnings must be tested. But look out for these common over-tests. A demand for final scientific proof is often unrealistic: in the sustainment of complex systems especially, actions usually must be taken early to be effective.
- Critics should be vetted for personal or professional investments in the opposite conclusion. Perhaps a petroleum engineer is not the best person to evaluate wind farms, nor should a wind-farm expert be placed on a nuclear power plant review team.
- Sometimes the loudest critic turns out to be a non-expert in the area you are trying to evaluate. The nation’s top scientist may conceivably have no background in low-earth-orbit power and navigation. Do they defer to the real experts, or attempt to form an opinion without the correct background?
- Perhaps the most damaging critic is the one who loudly agrees with the Cassandra but then retreats to a “now is not the time” argument. A good sustainer has a bias toward early action to avoid damage to the mission.
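As promised above, here is a minimal sketch of the “invisible obvious” coverage check: a matrix of subsystems against readiness factors, flagging any combination with no active observation program. The subsystem names, factor names, and coverage entries are hypothetical.

```python
# A minimal sketch of a readiness-coverage check, assuming a simple
# matrix of subsystems x readiness factors. All names and coverage
# entries below are hypothetical.
READINESS_FACTORS = ["reliability", "accuracy", "hardness against attack"]
SUBSYSTEMS = ["avionics", "propulsion", "mission planning"]

# Cells marked True have an active observation/assessment program;
# anything absent from the dict defaults to uncovered.
coverage = {
    ("avionics", "reliability"): True,
    ("avionics", "accuracy"): True,
    ("propulsion", "reliability"): True,
}

blind_spots = [
    (subsystem, factor)
    for subsystem in SUBSYSTEMS
    for factor in READINESS_FACTORS
    if not coverage.get((subsystem, factor), False)
]

for subsystem, factor in blind_spots:
    print(f"Blind spot: no observation of {subsystem} against '{factor}'")
```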
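And here is a minimal sketch of the monthly risk review’s output: a register of risks ranked by expected mission impact, scored as likelihood times consequence. The scoring scale and the example risks are illustrative assumptions, not a prescribed formula from the model.

```python
# A minimal sketch of a prioritized risk list, assuming a simple
# likelihood x consequence exposure score. The scale and example
# risks are illustrative assumptions.
from typing import NamedTuple

class Risk(NamedTuple):
    description: str
    likelihood: float  # probability of being realized, 0.0-1.0
    consequence: int   # mission impact if realized, 1 (minor) to 5 (severe)

    @property
    def exposure(self) -> float:
        return self.likelihood * self.consequence

register = [
    Risk("emerging actuator corrosion failure mode", 0.4, 5),
    Risk("obsolete avionics depot test equipment", 0.7, 3),
    Risk("single supplier for engine seals", 0.2, 4),
]

# Prioritize first; the resource question comes only after this list exists.
for risk in sorted(register, key=lambda r: r.exposure, reverse=True):
    print(f"{risk.exposure:4.1f}  {risk.description}")
```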
I highly recommend that anyone with an interest in good sustainment purchase and read Clarke and Eddy’s book, Warnings.