
20171118

WARNINGS by Richard A. Clarke and R.P. Eddy


  • The Cassandra problem is not only one of hearing the likely accurate predictions through the noise, but of processing them properly once they are identified.
  • First we must hear the forecast, then believe it, and finally act upon it. In practice, these steps are each individually challenging.
  • Cassandras are seldom appreciated for their efforts even after the disaster comes to pass.
  • “The fox knows many things, but the hedgehog knows one big thing.”
  • The hedgehog style of thinking is marked by a belief in the truth of “one big thing”—such as a fundamental, unifying theory—and then “aggressively extend[ing] the explanatory reach of that one big thing into new domains.”
  • Conversely, a fox’s cognitive style is marked by “flexible ‘ad hocery’ that require[s] stitching together diverse sources of information.”
  • The foxes consistently beat the hedgehogs in the accuracy of their predictions by a significant margin.
  • Predicting natural phenomena is stymied by the chaotic nature of the universe: natural processes are nonlinear systems driven by feedback loops that are often inherently unpredictable themselves.
  • Alternative history is a parlor game.
  • Instead, it was left implicit that if a phenomenon had never happened before, it never would. As illogical as that belief usually is, it has proven a powerful determinant in the way people react to Cassandras.
  • Cognitive biases worked well when rapid pattern recognition and decision making were critical for survival, but in the modern world these processes can have unintended consequences.
  • The availability bias tends to prejudice our interpretation and understanding of the world in favor of information that is most accessible in our memory, things that we have experienced in the recent past.
  • Personal experiences are often part of what drives Cassandras to make the choices they make.
  • The late Harvard historian Samuel Huntington proposed that, in analyzing leaders, it is always good to know what world events and personal experiences shaped them while they were young and their world view was being formed.
  • Across all fields we have studied, expert Cassandras often make appeals to the compelling nature of data they themselves collected or which they endorse as important. The data speaks for itself, the Cassandras believe. For non-experts, however, obscure data or variables that they have never heard of before do not speak to them in the same way, and they understand them no more than they would a sentence in a language they don’t know.
  • First, because they are calling out disasters beyond our collective horizon, Cassandras are often seen as pessimistic, naysaying nuisances.
  • Initial Occurrence Syndrome is a potent obstacle to accurate evaluation of a warning.
  • The lesson of 9/11 had been never to let a terrorist group establish a refuge from which it could launch attacks.
  • Earthquakes are a very real and significant risk to nuclear power plants, warranting special measures.
  • A Ponzi scheme involves no trades and no investments. It is a pure . . .
  • Psychology can trump rational decision making.
  • “Math is truth, finance is bullshit.”
  • Mining has long been recognized as one of the most dangerous occupations, and it remains so even today.
  • As we have seen, opposition to federal regulation by the coal companies has existed since the dawn of the industry. Mining companies maintained that they could regulate themselves and, as technical experts, were best suited to determine how to make their mines safer. Decades of horrific accidents and the unnecessary loss of thousands of miners’ lives proved this assertion untrue and resulted in slow but significant steps toward improved safety standards, stricter regulation, and genuine enforcement power by the USBM and later MSHA.
  • A Cassandra is only as good as her last correct prediction.
  • A common error is that we believe that “people who talk in a way you don’t understand know exactly what they are talking about, when actually, people who talk in a way you do understand know exactly what they are talking about.”
  • “Hubris is the cause of management mistakes 90 percent of the time.”
  • One man with the truth constitutes a majority.
  • We call this, our technique of identifying predicted future disasters that are being given insufficient attention today, the Cassandra Coefficient. It is a simple series of questions derived from our observation of past Cassandra Events. It involves four components: (1) the warning, the threat, or risk in question, (2) the decision makers or audience who must react, (3) the predictor or possible Cassandra, and (4) the critics who disparage or reject the warning.
  • For each of the four components, we have several characteristics, which we have seen appear frequently in connection with past Cassandra Events. [A rough scoring sketch of this rubric appears after these notes.]
  • Our experience with senior decision makers in governments and corporations, however, suggests to us that such leaders prefer something they can unpack, understand, and apply themselves.
  • Our proposition is straightforward. We have seen experts ignored in the past, when paying attention to them might have prevented or reduced the scope of calamities. In many of those cases, the same factors were at work over and over again. We can list them. If you see those things happening in that combination again, now or in the future, you may face a problem that deserves more attention and the application of a more diligent, rational, and unbiased analysis.
  • INITIAL OCCURRENCE SYNDROME: In many cases, the event foretold has “never happened before,” at least not in the cultural memory of the audience, who will therefore resist taking the threat seriously. In our estimation, no obstacle to action is bigger than Initial Occurrence Syndrome, yet it is the easiest objection to logically assail.
  • History is full of examples of things happening for the first time.
  • Social psychologists use the term “cognitive bias” to describe the filters, blinders, or limits we place between our points of view and reality.
  • If the warning is about something that has never happened in the historical record or personal experience of most people, it is likelier to be ignored.
  • The majority of relevant experts may not initially agree with the predictor who sees something new, but in times of uncertainty, the authority of expert groups can be deceiving.
  • Consensus is the business of politics. Science, on the contrary, requires only one investigator who happens to be right, which means that he or she has results that are verifiable by reference to the real world. . . . The greatest scientists in history are great precisely because they broke with consensus.
  • Decision makers throughout history have found comfort in the consensus of experts and rejected or persecuted the outlier who was often later proven correct.
  • If the issue or risk defies a long-standing consensus because of new evidence, it should be further examined despite being a minority view.
  • MAGNITUDE OVERLOAD: Overly large-scale events or phenomena can have two negative effects on decision makers.
  • First, the sheer size of the problem sometimes overwhelms and causes the manager to “shut down.”
  • Second, the decision maker may not be able to properly magnify their feelings of dread or empathy for disasters predicted to have massive death and loss. They simply may not be able to grasp how enormous a threat they face.
  • Human emotions take place within an analog brain.
  • Richard Farson notes, “The most important discoveries, the greatest art, and the best management decisions come from taking a fresh look at what people take for granted or cannot see precisely because it is too obvious.” Farson calls this the “Invisible Obvious.”
  • Not only can we be blinded to things because of their ubiquity or obviousness, we can also be blind to critical information because of how powerfully our attention can focus in other directions.
  • Cassandra Events often occur when the data is obvious if you look at it but most people are concentrating their attention elsewhere.
  • DIFFUSION OF RESPONSIBILITY: Often it is not clear whose job it is to detect the warning, evaluate it, and decide to act.
  • Frequently, no one wants to own an issue that’s about to become a disaster. This reluctance creates a “bystander effect,” wherein observers of the problem feel no responsibility to act.
  • Increasingly, complex issues are multidisciplinary, making it unclear where the responsibility lies. New complex problems or “issues on the seams” are more likely to produce ambiguity about who is in charge of dealing with them. This phenomenon is especially true when Initial Occurrence Syndrome is involved.
  • AGENDA INERTIA: Most organizations and their leadership have an agenda to which they are devotedly attached. Such groups are subject to Agenda Inertia, a force that concentrates focus on issues already in the plan. Unanticipated threats, ones that the leadership didn’t see coming and doesn’t really want to deal with, tend to have a difficult time crowding out preconceived agenda items.
  • Often a warning of catastrophe requires explanation or interpretation by experts and involves some technical understanding which may not be common in the audience of decision makers.
  • Systems can be so complex that even experts can’t see the disaster looming within. Complexity mismatch is a looming threat for government.
  • If an issue is so complex that no one person fully understands it, there may be an increased likelihood of a Cassandra Event.
  • Some decision makers reject the only available responses to a risk because those paths don’t conform to their ideology, for example about the proper role of government or science.
  • If believing in the risk leads inevitably to an ideologically repugnant response, some decision makers will reject the warning altogether.
  • It takes personal courage for leaders to detect and evaluate a warning, determine that a disaster is coming, and order resources expended and lives disrupted to deal with the risk.
  • Decision makers faced with a warning of disaster often do respond in some way, an insufficient way. Faced with the risk that Cassandra is correct, but not really understanding the technical aspects of the issue sufficiently to make a personal judgement, decision makers will often make a token response.
  • Ordering additional studies is a typical satisficing strategy, but may be appropriate if there is time and if the analyses truly are incomplete or insufficiently tested.
  • Information that is not routine and should be rapidly escalated to higher-level decision makers is often instead placed in the queue for consideration in due course by systems not designed for determining whether an alarm is urgently required.
  • Successfully finding Cassandras is unlikely to happen unless relevant organizations, both in government and the private sector, foster a culture poised to listen to warnings, and empower and train small cadres to recognize and test sentinel intelligence.
  • The Cassandras we have seen are recognized in their fields as competent experts.
  • “Contrary to popular belief, most successful innovators are not dropout geniuses, but well-trained experts in their field.”
  • Usually Cassandras have not given a dire warning before, or if they have, they were clearly proven to be right.
  • Because of the frustration of seeing a threat and wondering why others can’t, because of their personal sense of responsibility for promoting understanding and action on their discovery, and perhaps because of their high level of anxiety in general, Cassandras may at times appear obsessive and even socially abrasive.
  • Most people are unable to differentiate a message from the messenger.
  • Cassandras’ warnings are not generated on the basis of intuition or “analyst’s judgment.” They are driven to their conclusions by empirical evidence. Often they are the first ones to generate or discover the data, but the evidence is usually not in question. It is their interpretation of the data that makes them step away from the previous consensus. They tend to see the problem leaping out of their data with a clarity that makes them unique.
  • Most Cassandras tend to disbelieve anything that has not been empirically derived and repeatedly tested.
  • Cassandra is the person in the crowd who smells the smoke first and is sufficiently confident in her judgment and so filled with a sense of personal responsibility that she is the first to call 911 or pull the fire alarm.
  • For some issues, a high scientific standard of proof cannot be met in time to act. Some events cannot be accurately created, simulated, and repeated in the laboratory. To avoid the disaster, it may be necessary to abandon the normal protocol of waiting for all the evidence to be in and act instead on incomplete data and early indications.
  • People with something to lose from the revelation of a risk, or from the solutions, may criticize Cassandras for illegitimate reasons based on this self-interest.
  • Some experts react negatively out of simple jealousy that another expert, or worse, an outsider, got there first or is getting a lot of publicity.
  • Some ideas, once exposed to peer review, gain general expert acceptance but are still heavily assailed by non-experts, many of whom may have a vested interest either in keeping things as they are or otherwise not acting to prevent or mitigate the disaster.
  • Some opponents to acting on the basis of the evidence of an impending disaster know better than to deride the experts or question their technical data and analyses. Instead, they seek to minimize the urgency or defer consideration to some vaguely defined future.
  • History teaches us that decision makers make exactly the wrong decision about Cassandras because of mental strategies we all employ subconsciously.
  • The problem with heuristics is that they tend to become cognitive biases, inaccurate prejudices against one decision or one person versus the alternative.
  • Rapidly judging other people is the primary skill for which our prehistoric brains are optimized, but this skill may be a dangerous barrier injecting subjective bias where we’d be better served by objective rationality.
  • In addition to unconscious biases, a frequent reason decision makers ignore Cassandras is fear of being embarrassed by embracing someone who turns out to be wrong.
  • If you decide to act in a big way to solve the alleged problem, it is, of course, less risky if the costs of your being wrong are limited and are clearly outweighed by the potential impact of not acting or doing so insufficiently.
  • While technology exists for AI-driven weapons, we humans are not quite up to speed on how safe they are or will be, or how to use them.
  • Artificial intelligence is a broad term, maybe overly broad. It simply means a computer program that can perform a task that normally must be done by a human.
  • Machine-learning programs “have the ability to learn without being explicitly programmed,” optimizing themselves to most efficiently meet a set of pre-established goals. [A minimal worked example appears after these notes.]
  • Superintelligence is an artificial intelligence that will be “smarter” than its human creators in all the metrics we define as intelligence.
  • The theoretical physicist Stephen Hawking warns that AI is “likely to be either the best or worst thing ever to happen to humanity, so there’s huge value in getting it right.”
  • His thesis is simple, though his solution is not: if we are to have any hope against superintelligence, we need to code it properly from the beginning. The answer, Eliezer Yudkowsky believes, is one of morality. AI must be programmed with a set of ethical codes that align with humanity’s.
  • Humans will relentlessly pursue the creation of superintelligence because it holds unimaginable promise to transform the world.
  • A superintelligent computer will recursively self-improve to levels of intelligence that we may not even be able to comprehend, and no one knows whether this self-improvement will happen over a long period of time or within the first second of being turned on.
  • Artificial intelligence has the potential to be dramatically more powerful than any previous scientific advance.
  • Nearly all AI theorists predict superintelligence is coming and that it will so vastly eclipse the cognitive capacity of even the smartest humans that we simply cannot comprehend its intellectual horsepower.
  • Once humans are no longer the most intelligent species on the planet, humankind will survive only at the whim of whatever is.
  • Superintelligence gone wrong is a species-level threat, a human extinction event.
  • Humans are neither the fastest nor the strongest creatures on the planet but dominate for one reason: humans are the smartest.
  • Yudkowsky takes a highly pragmatic approach to the world and communicates with others according to what are called Crocker’s rules, one of which holds that the most efficient path of communication is always the best, even at the expense of social niceties.
  • Unemployment is corrosive to government stability and calls for remarkably deft leadership, lest the nation collapse.
  • Described as the third revolution in warfare, autonomous weapons can select targets and fire without any human intervention.
  • The Grim Reaper’s favorite disguise is disease. Disease makes other disasters look trivial. More human lives have been ended by bacteria and viruses than by every other kind of catastrophe combined, disease’s constant presence masking its destruction. Even war lags behind.
  • Disease is now and has forever been humankind’s greatest killer, and it certainly is not being ignored; health care is one of the world’s biggest industries.
  • Influenza is amazingly adaptable. It changes lethality and transmissibility quickly and jumps from animal to human more readily than any other disease. New flu mutations emerge daily, some proving more contagious than others.
  • Warning of disaster is a complicated and tricky business, particularly when the solutions are expensive.
  • Fossil fuels are cheap only because they do not pay their costs to society and receive large direct and indirect subsidies.
  • The reality is that no one really knows what will happen when one side starts using nuclear weapons against another side that is also nuclear armed.
  • The problem is that the health care industry, which is rushing headlong into the IoT, has a bad track record when it comes to cyber security.
  • Often medical devices are running on software that is so old that its creators no longer try to fix it, akin to an old version of Microsoft Windows from the 1990s.
  • “Conventional malware is rampant in hospitals because of medical devices using unpatched operating systems.”
  • Complexity is growing in the very machines that control the underpinnings of modern societies, nearly all of which are being developed and deployed with substantial cyber vulnerabilities.
  • Using those lessons and the case studies presented in this book, we suggest addressing the four necessary functions as follows:
    • 1. SCANNING FOR PROBLEMS
    • 2. SEPARATING THE SIGNAL FROM THE NOISE
    • 3. RESPONSES: SURVEILLANCE, HEDGING, MITIGATING, AND PREVENTING
    • 4. PERSUADE
  • Professional experts giving warnings do not hide under rocks.
  • Because many of the problems we examined have costly mitigation or prevention strategies, maybe even requiring substantial changes in the way we live, doing just enough to quiet demands for action, while failing to actually solve the problem, tends to be the easiest of solutions.
  • Prevention of a disaster is the best course of action when feasible in time and cost.
  • Our combined six decades of experience in the government and private sector tells us that only leaders who are convinced of the urgency of an issue get things done.
  • Complexity hides vulnerabilities, creating new problems or complicating existing ones. Nowhere do we believe this tendency is clearer than in the inevitable convergence of two fields: artificial intelligence and the Internet of Things.
  • We believe that the recent decades of scientific and engineering breakthroughs place us on the cusp of a brave new world, one that promises great improvement for humanity. To achieve that elevated status, however, we must be on the lookout for the potential concomitant catastrophes that would slow or reverse that progress, as continued progress and improvement are not inevitable. Thus, we must systematically identify the people who see the risks first, test what these potential Cassandras are saying, then make transparent and explicit decisions about how to deal with the risk.
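
The Cassandra Coefficient bullets above describe a series of questions across four components (the warning, the decision makers, the predictor, and the critics), each with several recurring characteristics. The book presents this as a qualitative checklist rather than a published formula, so the sketch below is only an illustrative assumption: the characteristic names, the 0-3 scoring, the equal weighting, and the averaging are editorial choices, not the authors'.

    # Illustrative sketch only: the Cassandra Coefficient is described as a
    # qualitative series of questions; the scoring scale, weights, and the
    # specific characteristics listed here are assumptions for illustration.
    from dataclasses import dataclass, field
    from typing import Dict


    @dataclass
    class CassandraAssessment:
        # Each dict maps a characteristic (a "question") to a 0-3 score:
        # 0 = absent, 3 = strongly present.
        warning: Dict[str, int] = field(default_factory=dict)          # the threat or risk itself
        decision_makers: Dict[str, int] = field(default_factory=dict)  # the audience that must react
        predictor: Dict[str, int] = field(default_factory=dict)        # the possible Cassandra
        critics: Dict[str, int] = field(default_factory=dict)          # those rejecting the warning

        def coefficient(self) -> float:
            """Average score across all answered questions (0.0 to 3.0)."""
            scores = [
                score
                for component in (self.warning, self.decision_makers, self.predictor, self.critics)
                for score in component.values()
            ]
            return sum(scores) / len(scores) if scores else 0.0


    # Example usage with characteristics drawn from the notes above.
    assessment = CassandraAssessment(
        warning={"initial occurrence syndrome": 3, "magnitude overload": 2, "complexity mismatch": 2},
        decision_makers={"diffusion of responsibility": 3, "agenda inertia": 2, "satisficing": 2},
        predictor={"recognized expert": 3, "data-driven": 3, "sense of personal responsibility": 2},
        critics={"vested interest in status quo": 3, "appeals to wait for more data": 2},
    )
    print(f"Cassandra Coefficient (0-3 scale): {assessment.coefficient():.2f}")

A higher average suggests the warning deserves the more diligent, rational, and unbiased analysis the authors call for; the point of the sketch is only to make the four-component structure concrete.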
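The machine-learning bullet above says such programs "learn without being explicitly programmed," optimizing themselves toward a pre-established goal. As a minimal, hedged illustration (the data, model, and learning rate below are arbitrary choices, not anything from the book), here is a tiny program that is given only examples and an error-minimization goal, and discovers the rule y = 2x + 1 on its own:

    # The program is never told the rule y = 2x + 1; it sees only examples and a
    # pre-established goal (minimize squared error) and adjusts its own parameters.
    examples = [(x, 2 * x + 1) for x in range(10)]  # hidden rule to be discovered

    w, b = 0.0, 0.0        # the program's adjustable parameters
    learning_rate = 0.01

    for step in range(2000):
        # Gradient of mean squared error with respect to w and b.
        grad_w = sum(2 * (w * x + b - y) * x for x, y in examples) / len(examples)
        grad_b = sum(2 * (w * x + b - y) for x, y in examples) / len(examples)
        # Self-adjustment: move the parameters in the direction that reduces error.
        w -= learning_rate * grad_w
        b -= learning_rate * grad_b

    print(f"learned w={w:.2f}, b={b:.2f}  (target was w=2, b=1)")

The same optimize-toward-a-goal loop, scaled up enormously, is what the superintelligence bullets worry about: the goal is pre-established, but the behavior that best satisfies it is not explicitly programmed.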
