- We are all forecasters.
- It turns out that forecasting is not a “you have it or you don’t” talent. It is a skill that can be cultivated.
- Our desire to reach into the future will always exceed our grasp.
- It is generally true that the further we try to look into the future, the harder it is to see.
- Laws of physics aside, there are no universal constants, so separating the predictable from the unpredictable is difficult work. There’s no way around it.
- “I have been struck by how important measurement is to improving the human condition,” Bill Gates wrote. “You can achieve incredible progress if you set a clear goal and find a measure that will drive progress toward that goal….This may seem basic, but it is amazing how often it is not done and how hard it is to get right.”
- Superforecasting does require minimum levels of intelligence, numeracy, and knowledge of the world, but anyone who reads serious books about psychological research probably has those prerequisites.
- superforecasting demands thinking that is open-minded, careful, curious, and—above all—self-critical. It also demands focus.
- The point is now indisputable: when you have a well-validated statistical algorithm, use it.
- Until quite recently in historical terms, it was not unusual for a sick person to be better off if there was no physician available because letting an illness take its natural course was less dangerous than what a physician would inflict.
- Randomized controlled trials are now routine. Yet when the randomized trial was introduced, it was revolutionary, because medicine had never before been scientific.
- It is natural to identify our thinking with the ideas, images, plans, and feelings that flow through consciousness.
- In describing how we think and decide, modern psychologists often deploy a dual-system model that partitions our mental universe into two domains. System 2 is the familiar realm of conscious thought. It consists of everything we choose to focus on. By contrast, System 1 is largely a stranger to us. It is the realm of automatic perceptual and cognitive operations—like those you are running right now to transform the print on this page into a meaningful sentence or to hold the book while reaching for a glass and taking a sip. We have no awareness of these rapid-fire processes but we could not function without them. We would shut down.
- Conscious thought is demanding.
- Gathering all evidence and mulling it over may be the best way to produce accurate answers, but a hunter-gatherer who consults statistics on lions before deciding whether to worry about the shadow moving in the grass isn’t likely to live long enough to bequeath his accuracy-maximizing genes to the next generation. Snap judgments are sometimes essential.
- A defining feature of intuitive judgment is its insensitivity to the quality of the evidence on which the judgment is based.
- Of course, System 1 can’t conclude whatever it wants. The human brain demands order. The world must make sense, which means we must be able to explain what we see and think. And we usually can—because we are creative confabulators hardwired to invent stories that impose coherence on the world.
- Sane people are expected to have sensible-sounding reasons for their actions.
- in science, the best evidence that a hypothesis is true is often an experiment designed to prove the hypothesis is false, but which fails to do so.
- As the post-Oslo speculation reveals, our natural inclination is to grab on to the first plausible explanation and happily gather supportive evidence without checking its reliability. That is what psychologists call confirmation bias. We rarely seek out evidence that undercuts our first explanation, and when that evidence is shoved under our noses we become motivated skeptics—finding reasons, however tenuous, to belittle it or throw it out entirely.
- Formally, it’s called attribute substitution, but I call it bait and switch: when faced with a hard question, we often surreptitiously replace it with an easy one.
- There is nothing mystical about an accurate intuition like the fire commander’s. It’s pattern recognition. With training or experience, people can encode patterns deep in their memories in vast number and intricate detail—such as the estimated fifty thousand to one hundred thousand chess positions that top players have in their repertoire.
- If something doesn’t fit a pattern—like a kitchen fire giving off more heat than a kitchen fire should—a competent expert senses it immediately.
- Our pattern-recognition ability comes at the cost of susceptibility to false positives.
- Whether intuition generates delusion or insight depends on whether you work in a world full of valid cues you can unconsciously register for future use.
- The tip-of-your-nose perspective can work wonders but it can also go terribly awry, so if you have the time to think before making a big decision, do so—and be prepared to accept that what seems obviously true now may turn out to be false later.
- The first step in learning what works in forecasting, and what doesn’t, is to judge forecasts, and to do that we can’t make assumptions about what the forecast means. We have to know.
- The truth is, the truth is elusive.
- Judging forecasts is much harder than often supposed, a lesson I learned the hard way—from extensive and exasperating experience.
- Obviously, a forecast without a time frame is absurd. And yet, forecasters routinely make them.
- But it was never adopted. People liked clarity and precision in principle but when it came time to make clear and precise forecasts they weren’t so keen on numbers. Some said it felt unnatural or awkward, which it does when you’ve spent a lifetime using vague language, but that’s a weak argument against change.
- But hopelessly vague language is still so common, particularly in the media, that we rarely notice how vacuous it is. It just slips by.
- Almost anything “may” happen.
- If we are serious about measuring and improving, this won’t do. Forecasts must have clearly defined terms and timelines. They must use numbers. And one more thing is essential: we must have lots of forecasts.
- We cannot rerun history so we cannot judge one probabilistic forecast—but everything changes when we have many probabilistic forecasts.
- Two ways to be miscalibrated: underconfident (events happen more often than your stated probabilities imply, so your calibration curve sits above the perfect-calibration line) and overconfident (events happen less often than stated, so the curve sits below the line)
- The many forecasts required for calibration calculations make it impractical to judge forecasts about rare events, and even with common events it means we must be patient data collectors—and cautious data interpreters.
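The calibration idea above is easy to sketch in code. This is a minimal illustration with a made-up track record, not the study's actual scoring code: group (stated probability, outcome) pairs and check whether each stated probability matches the frequency actually observed.

```python
from collections import defaultdict

def calibration_table(forecasts):
    """Group (stated probability, outcome) pairs and compare each
    stated probability to the frequency actually observed."""
    buckets = defaultdict(list)
    for prob, happened in forecasts:
        buckets[prob].append(1 if happened else 0)
    return {p: (sum(o) / len(o), len(o)) for p, o in sorted(buckets.items())}

# Made-up track record: a well-calibrated forecaster who says "70%"
# should be right about 70% of the time, and "30%" about 30% of the time.
history = [(0.7, True), (0.7, True), (0.7, False), (0.7, True),
           (0.3, False), (0.3, False), (0.3, True), (0.3, False)]
for stated, (observed, n) in calibration_table(history).items():
    print(f"stated {stated:.0%}: happened {observed:.0%} of the time ({n} forecasts)")
```

Note how even this toy version needs four forecasts per bucket before the observed frequency means anything — hence the "patient data collectors" point above.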
- Beating the average consistently requires rare skill.
- How well aggregation works depends on what you are aggregating.
- All too often we agree. We don’t consider alternative views—even when it’s clear that we should.
- People can and do think differently in different circumstances—cool and calculating at work, perhaps, but intuitive and impulsive when shopping. And our thinking habits are not immutable. Sometimes they evolve without our awareness of the change. But we can also, with effort, choose to shift gears from one mode to another.
- Models are supposed to simplify things, which is why even the best are flawed. But they’re necessary. Our minds are full of models. We couldn’t function without them.
- The job of intelligence is to speak truth to power, not to tell the politicians temporarily in charge what they want to hear.
- The IC is a huge bureaucracy that responds slowly even to the shock of major failures.
- When you combine the judgments of a large group of people to calculate the “wisdom of the crowd” you collect all the relevant information that is dispersed among all those people.
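The averaging behind the "wisdom of the crowd" can be simulated in a few lines. This is a toy sketch with invented numbers (the ox-weight setup loosely echoes Galton's classic example): it assumes each person's error is independent and unbiased, which is exactly the "depends on what you are aggregating" caveat above.

```python
import random

random.seed(42)   # fixed seed so the simulation is reproducible
TRUE_VALUE = 850  # hypothetical quantity being estimated (e.g. an ox's weight in kg)

# Each person's guess is the truth plus independent personal error.
guesses = [TRUE_VALUE + random.gauss(0, 120) for _ in range(1000)]

crowd_estimate = sum(guesses) / len(guesses)
individual_errors = [abs(g - TRUE_VALUE) for g in guesses]
avg_individual_error = sum(individual_errors) / len(individual_errors)

print(f"average individual error: {avg_individual_error:.1f}")
print(f"error of the crowd mean:  {abs(crowd_estimate - TRUE_VALUE):.1f}")
```

Averaging cancels the independent errors, so the crowd mean lands far closer to the truth than the typical individual — but only because the errors here were built to be independent; correlated errors would not cancel.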
- it’s easy to misinterpret randomness. We don’t have an intuitive feel for it.
- Most things in life involve skill and luck, in varying proportions.
- Knowledge is something we can all increase, but only slowly. People who haven’t stayed mentally active have little hope of catching up to lifelong learners.
- High-powered pattern recognition skills won’t get you far, though, if you don’t know where to look for patterns in the real world.
- When we make estimates, we tend to start with some number and adjust. The number we start with is called the anchor. It’s important because we typically underadjust, which means a bad anchor can easily produce a bad estimate. And it’s astonishingly easy to settle on a bad anchor.
- Researchers have found that merely asking people to assume their initial judgment is wrong, to seriously consider why that might be, and then make another judgment, produces a second estimate which, when combined with the first, improves accuracy almost as much as getting a second estimate from another person. The same effect was produced simply by letting several weeks pass before asking people to make a second estimate.
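The "crowd within" finding above amounts to simple averaging of two guesses from the same person. A sketch with hypothetical guesses (the ~6,650 km figure for the Nile is a rough real-world value):

```python
def combined_estimate(first, second):
    # Averaging two independently made guesses lets opposite errors cancel.
    return (first + second) / 2

true_km = 6650                           # length of the Nile, roughly, in km
first_guess, second_guess = 5000, 7600   # hypothetical guesses made weeks apart

for label, est in [("first", first_guess), ("second", second_guess),
                   ("combined", combined_estimate(first_guess, second_guess))]:
    print(f"{label:>8} estimate {est:.0f}: off by {abs(est - true_km):.0f} km")
```

Because the two guesses erred in opposite directions, their average is closer to the truth than either guess alone — the benefit the research attributes to assuming your first judgment is wrong and judging again.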
- There is an even simpler way of getting another perspective on a question: tweak its wording.
- For superforecasters, beliefs are hypotheses to be tested, not treasures to be guarded.
- The great majority of their forecasts are simply the product of careful thought and nuanced judgment.
- A smart executive will not expect universal agreement, and will treat its appearance as a warning flag that groupthink has taken hold. An array of judgments is welcome proof that the people around the table are actually thinking for themselves and offering their unique perspectives.
- Human beings have coped with uncertainty for as long as we have been recognizably human. And for almost all that time we didn’t have access to statistical models of uncertainty because they didn’t exist.
- A confident yes or no is satisfying in a way that maybe never is, a fact that helps to explain why the media so often turn to hedgehogs who are sure they know what is coming no matter how bad their forecasting records may be.
- Confidence and accuracy are positively correlated. But research shows we exaggerate the size of the correlation.
- Scientific facts that look as solid as rock to one generation of scientists can be crushed to dust beneath the advances of the next. All scientific knowledge is tentative. Nothing is chiseled in granite.
- An awareness of irreducible uncertainty is the core of probabilistic thinking, but it’s a tricky thing to measure.
- Epistemic uncertainty is something you don’t know but is, at least in theory, knowable.
- Aleatory uncertainty is something you not only don’t know but cannot know; it is unknowable even in theory.
- Aleatory uncertainty ensures life will always have surprises, regardless of how carefully we plan. Superforecasters grasp this deep truth better than most.
- Meaning is a basic human need.
- Science doesn’t tackle “why” questions about the purpose of life. It sticks to “how” questions that focus on causation and probabilities.
- So finding meaning in events is positively correlated with wellbeing but negatively correlated with foresight. That sets up a depressing possibility: Is misery the price of accuracy?
- A forecast that is updated to reflect the latest available information is likely to be closer to the truth than a forecast that isn’t so informed.
- Superforecasters update much more frequently, on average, than regular forecasters.
- To be a top-flight forecaster, a growth mindset is essential.
- We learn new skills by doing. We improve those skills by doing more. These fundamental facts are true of even the most demanding skills.
- To learn from failure, we must know when we fail.
- Vague language is elastic language.
- Once we know the outcome of something, that knowledge skews our perception of what we thought before we knew the outcome: that’s hindsight bias.
- To get better at a certain type of forecasting, that is the type of forecasting you must do—over and over again, with good feedback telling you how your training is going, and a cheerful willingness to say, “Wow, I got that one wrong. I’d better think about why.”
- Superforecasters are perpetual beta.
- In their philosophic outlook, they [superforecasters] tend to be:
- CAUTIOUS: Nothing is certain
- HUMBLE: Reality is infinitely complex
- NONDETERMINISTIC: What happens is not meant to be and does not have to happen
- In their abilities and thinking styles, they tend to be:
- ACTIVELY OPEN-MINDED: Beliefs are hypotheses to be tested, not treasures to be protected
- INTELLIGENT AND KNOWLEDGEABLE, WITH A “NEED FOR COGNITION”: Intellectually curious, enjoy puzzles and mental challenges
- REFLECTIVE: Introspective and self-critical
- NUMERATE: Comfortable with numbers
- In their methods of forecasting they tend to be:
- PRAGMATIC: Not wedded to any idea or agenda
- ANALYTICAL: Capable of stepping back from the tip-of-your-nose perspective and considering other views
- DRAGONFLY-EYED: Value diverse views and synthesize them into their own
- PROBABILISTIC: Judge using many grades of maybe
- THOUGHTFUL UPDATERS: When facts change, they change their minds
- GOOD INTUITIVE PSYCHOLOGISTS: Aware of the value of checking thinking for cognitive and emotional biases
- In their work ethic, they tend to have:
- A GROWTH MINDSET: Believe it’s possible to get better
- GRIT: Determined to keep at it however long it takes
- The strongest predictor of rising into the ranks of superforecasters is perpetual beta, the degree to which one is committed to belief updating and self-improvement.
- A leader “needs to figure out what’s the right move and then execute it boldly.”
- The humility required for good judgment is not self-doubt—the sense that you are untalented, unintelligent, or unworthy. It is intellectual humility. It is a recognition that reality is profoundly complex, that seeing things clearly is a constant struggle, when it can be done at all, and that human judgment must therefore be riddled with mistakes.
- The “black swan” is therefore a brilliant metaphor for an event so far outside experience we can’t even imagine it until it happens.
- Fuzzy thinking can never be proven wrong.
- Far too many people treat numbers like sacred totems offering divine insight. The truly numerate know that numbers are tools, nothing more, and their quality can range from wretched to superb.
- Numbers must be constantly scrutinized and improved, which can be an unnerving process because it is unending. Progressive improvement is attainable. Perfection is not.
- Foresight is one element of good judgment, but there are others, including some that cannot be counted and run through a scientist’s algorithms—moral judgment, for one.
- Another critical dimension of good judgment is asking good questions.
- Triage.
- Focus on questions where your hard work is likely to pay off.
- Concentrate on questions in the Goldilocks zone of difficulty, where effort pays off the most.
- Certain classes of outcomes have well-deserved reputations for being radically unpredictable (e.g., oil prices, currency markets). But we usually don’t discover how unpredictable outcomes are until we have spun our wheels for a while trying to gain analytical traction.
- Break seemingly intractable problems into tractable sub-problems.
- Decompose the problem into its knowable and unknowable parts. Flush ignorance into the open. Expose and examine your assumptions. Dare to be wrong by making your best guesses.
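Decomposition is the Fermi-estimation habit the book illustrates with the classic piano-tuners-in-Chicago question: replace one impossible question with several guessable ones. All the inputs below are rough guesses — the point is the decomposition, not the numbers.

```python
# All inputs are rough guesses; dare to be wrong by writing them down.
population = 2_700_000               # people in Chicago
people_per_household = 2
households_with_piano = 1 / 20       # one household in twenty
tunings_per_piano_per_year = 1
tunings_per_tuner_per_year = 1_000   # ~5 a day, ~200 working days

households = population / people_per_household
pianos = households * households_with_piano
piano_tuners = pianos * tunings_per_piano_per_year / tunings_per_tuner_per_year
print(f"estimated piano tuners in Chicago: about {piano_tuners:.0f}")
```

Each line flushes one assumption into the open, where it can be examined and corrected — which is exactly what "expose and examine your assumptions" asks for.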
- Strike the right balance between inside and outside views.
- Superforecasters are in the habit of posing the outside-view question: How often do things of this sort happen in situations of this sort?
- Strike the right balance between under- and overreacting to evidence.
- Belief updating is to good forecasting as brushing and flossing are to good dental hygiene. It can be boring, occasionally uncomfortable, but it pays off in the long term.
- Look for the clashing causal forces at work in each problem.
- For every good policy argument, there is typically a counterargument that is at least worth acknowledging.
- Strive to distinguish as many degrees of doubt as the problem permits but no more.
- Few things are either certain or impossible. And “maybe” isn’t all that informative. So your uncertainty dial needs more than three settings. Nuance matters. The more degrees of uncertainty you can distinguish, the better a forecaster you are likely to be.
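Tetlock scores forecasts with Brier scores — squared error between the stated probability and what happened (the single-event form is assumed here; the made-up forecasts are for illustration only). A sketch of why a dial with more settings beats a yes/maybe/no dial for a well-calibrated forecaster:

```python
def brier(prob, outcome):
    # Single-event Brier form: squared distance between forecast and reality.
    return (prob - outcome) ** 2

def three_setting_dial(prob):
    # Collapse nuanced probabilities to "no / maybe / yes".
    if prob < 1/3:
        return 0.0
    if prob < 2/3:
        return 0.5
    return 1.0

# A perfectly calibrated forecaster: an event given probability p
# happens exactly p of the time (simulated here with fixed counts).
forecasts = []
for p, n in [(0.1, 10), (0.4, 10), (0.8, 10)]:
    happened = round(p * n)
    forecasts += [(p, 1)] * happened + [(p, 0)] * (n - happened)

nuanced = sum(brier(p, o) for p, o in forecasts) / len(forecasts)
coarse = sum(brier(three_setting_dial(p), o) for p, o in forecasts) / len(forecasts)
print(f"nuanced Brier score: {nuanced:.3f}  (lower is better)")
print(f"three-setting dial:  {coarse:.3f}")
```

Rounding the same forecasts down to three settings throws away information and raises (worsens) the score — the cost of an uncertainty dial with too few settings.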
- Strike the right balance between under- and overconfidence, between prudence and decisiveness.
- Look for the errors behind your mistakes but beware of rearview-mirror hindsight biases.
- Don’t try to justify or excuse your failures. Own them! Conduct unflinching postmortems: Where exactly did I go wrong?
- Not all successes imply that your reasoning was right. You may have just lucked out by making offsetting errors.
- Bring out the best in others and let others bring out the best in you.
- Master the fine arts of team management, especially perspective taking (understanding the arguments of the other side so well that you can reproduce them to the other’s satisfaction), precision questioning (helping others to clarify their arguments so they are not misunderstood), and constructive confrontation (learning to disagree without being disagreeable).
- Master the error-balancing bicycle.
- Learning requires doing, with good feedback that leaves no ambiguity about whether you are succeeding—“I’m rolling along smoothly!”—or whether you are failing—“crash!”
- Like all other known forms of expertise, superforecasting is the product of deep, deliberative practice.
- Don’t treat commandments as commandments.
- Guidelines are the best we can do in a world where nothing is certain or exactly repeatable.
2017-11-17
SUPERFORECASTING by Philip E. Tetlock, Dan Gardner