- The failure to think clearly, or what experts call a “cognitive error,” is a systematic deviation from logic—from optimal, rational, reasonable thought and behavior.
- Cognitive errors are far too ingrained to rid ourselves of them completely.
- In daily life, because triumph is made more visible than failure, you systematically overestimate your chances of succeeding.
- The vast number of books and coaches dealing with success should also make you skeptical: The unsuccessful don’t write books or give lectures on their failures.
- Survivorship bias means this: People systematically overestimate their chances of success.
- Whenever we confuse selection factors with results, we fall prey to what Taleb calls the swimmer’s body illusion.
- The human brain seeks patterns and rules. In fact, it takes it one step further: If it finds no familiar patterns, it simply invents some.
- In conclusion: When it comes to pattern recognition, we are oversensitive. Regain your skepticism. If you think you have discovered a pattern, first consider it pure chance. If it seems too good to be true, find a mathematician and have the data tested statistically.
- Social proof, sometimes roughly termed the “herd instinct,” dictates that individuals feel they are behaving correctly when they act the same as other people. In other words, the more people who follow a certain idea, the better (truer) we deem the idea to be. And the more people who display a certain behavior, the more appropriate this behavior is judged by others. This is, of course, absurd.
- The sunk cost fallacy is most dangerous when we have invested a lot of time, money, energy, or love in something. This investment becomes a reason to carry on, even if we are dealing with a lost cause. The more we invest, the greater the sunk costs are, and the greater the urge to continue becomes.
- We find contradictions abominable.
- We find reciprocity in all species whose food supplies are subject to high fluctuations.
- Reciprocity is a very useful survival strategy, a form of risk management. Without it, humanity—and countless species of animals—would be long extinct. It is at the core of cooperation between people (who are not related) and a necessary ingredient for economic growth and wealth creation. There would be no global economy without it—there would be no economy at all. That’s the good side of reciprocity.
- But there is also an ugly side of reciprocity: retaliation. Revenge breeds counter-revenge, and you soon find yourself in a full-scale war.
- The confirmation bias is the mother of all misconceptions. It is the tendency to interpret new information so that it becomes compatible with our existing theories, beliefs, and convictions. In other words, we filter out any new information that contradicts our existing views (“disconfirming evidence”). This is a dangerous practice.
- If the word “exception” crops up, prick up your ears. Often it hides the presence of disconfirming evidence.
- Whether you go through life believing that “people are inherently good” or “people are inherently bad,” you will find daily proof to support your case.
- To fight against the confirmation bias, try writing down your beliefs—whether in terms of worldview, investments, marriage, health care, diet, career strategies—and set out to find disconfirming evidence.
- Axing beliefs that feel like old friends is hard work but imperative.
- Authorities crave recognition and constantly find ways to reinforce their status.
- Whenever you are about to make a decision, think about which authority figures might be exerting an influence on your reasoning. And when you encounter one in the flesh, do your best to challenge him or her.
- We don’t notice small, gradual changes.
- The availability bias says this: We create a picture of the world using the examples that most easily come to mind. This is idiotic, of course, because in reality, things don’t happen more frequently just because we can conceive of them more easily.
- If something is repeated often enough, it gets stored at the forefront of our minds. It doesn’t even have to be true.
- We prefer wrong information to no information.
- We require others’ input to overcome the availability bias.
- A mere smoke screen, the it’ll-get-worse-before-it-gets-better fallacy is a variant of the so-called confirmation bias. If the problem continues to worsen, the prediction is confirmed. If the situation improves unexpectedly, the customer is happy, and the expert can attribute it to his prowess. Either way he wins.
- In conclusion: If someone says, “It’ll get worse before it gets better,” you should hear alarm bells ringing. But beware: Situations do exist where things first dip, then improve.
- From our own life stories to global events, we shape everything into meaningful stories. Doing so distorts reality and affects the quality of our decisions, but there is a remedy: Pick these apart.
- Whenever you hear a story, ask yourself: Who is the sender, what are his intentions, and what did he hide under the rug?
- The hindsight bias is one of the most prevailing fallacies of all. We can aptly describe it as the “I told you so” phenomenon: In retrospect, everything seems clear and inevitable.
- We systematically overestimate our knowledge and our ability to predict—on a massive scale.
- Experts suffer even more from the overconfidence effect than laypeople do.
- The overconfidence effect is more pronounced in men—women tend not to overestimate their knowledge and abilities as much.
- Be aware that you tend to overestimate your knowledge. Be skeptical of predictions, especially if they come from so-called experts. And with all plans, favor the pessimistic scenario. This way, you have a chance of judging the situation somewhat realistically.
- Unfortunately, it is increasingly difficult to separate true knowledge from chauffeur knowledge.
- Be on the lookout for chauffeur knowledge. Do not confuse the company spokesperson, the ringmaster, the newscaster, the schmoozer, the verbiage vendor, or the cliché generator with those who possess true knowledge. How do you recognize the difference? There is a clear indicator: True experts recognize the limits of what they know and what they do not know. If they find themselves outside their circle of competence, they keep quiet or simply say, “I don’t know.”
- The illusion of control is the tendency to believe that we can influence something over which we have absolutely no sway.
- People respond to incentives by doing what is in their best interests. What is noteworthy is, first, how quickly and radically people’s behavior changes when incentives come into play or are altered, and second, the fact that people respond to the incentives themselves, and not the grander intentions behind them.
- Good incentive systems comprise both intent and reward.
- Poor incentive systems, on the other hand, overlook and sometimes even pervert the underlying aim.
- Keep an eye out for the incentive super-response tendency. If a person’s or an organization’s behavior confounds you, ask yourself what incentive might lie behind it. I guarantee you that you’ll be able to explain 90 percent of the cases this way. What makes up the remaining 10 percent? Passion, idiocy, psychosis, or malice.
- Extreme performances are interspersed with less extreme ones.
- We tend to evaluate decisions based on the result rather than on the decision process. This fallacy is also known as the “historian error.”
- Never judge a decision purely by its result, especially when randomness and “external factors” play a role. A bad result does not automatically indicate a bad decision and vice versa. So rather than tearing your hair out about a wrong decision, or applauding yourself for one that may have only coincidentally led to success, remember why you chose what you did. Were your reasons rational and understandable? Then you would do well to stick with that method, even if you didn’t strike it lucky last time.
- The more choice you have, the more unsure and therefore dissatisfied you are afterward.
- The liking bias is startlingly simple to understand and yet we continually fall prey to it. It means this: The more we like someone, the more inclined we are to buy from or help that person.
- We consider things to be more valuable the moment we own them.
- Improbable coincidences are precisely that: rare but very possible events. It’s not surprising when they finally happen. What would be more surprising is if they never came to be.
- If you ever find yourself in a tight, unanimous group, you must speak your mind, even if your team does not like it. Question tacit assumptions, even if you risk expulsion from the warm nest. And, if you lead a group, appoint someone as devil’s advocate.
- We respond to the expected magnitude of an event (the size of the jackpot or the amount of electricity), but not to its likelihood. In other words: We lack an intuitive grasp of probability.
- We have no intuitive grasp of risk and thus distinguish poorly among different threats. The more serious the threat and the more emotional the topic (such as radioactivity), the less reassuring a reduction in risk seems to us.
- Rara sunt cara, said the Romans. Rare is valuable. In fact, the scarcity error is as old as mankind.
- When we are deprived of an option, we suddenly deem it more attractive. It is a kind of act of defiance.
- In conclusion: The typical response to scarcity is a lapse in clear thinking.
- For most people, survivorship bias (chapter 1) is one of the causes for their base-rate neglect. They tend to see only the successful individuals and companies because the unsuccessful cases are not reported (or underreported). This makes them neglect the large part of the “invisible” cases.
- People believe in the “balancing force of the universe.” This is the gambler’s fallacy.
- Purely independent events really only exist at the casino, in the lottery, and in theory. In real life, in the financial markets and in business, with the weather and your health, events are often interrelated.
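The independence point can be checked with a quick simulation (a minimal sketch; the function name and parameters are my own): condition on a streak of heads having just occurred and look at the very next flip.

```python
import random

def p_heads_after_streak(streak=5, flips=200_000, seed=1):
    """Empirical P(heads | the previous `streak` flips were all heads).

    For a fair coin the flips are independent, so this stays near 0.5:
    the "balancing force" the gambler's fallacy expects never shows up.
    """
    rng = random.Random(seed)
    seq = [rng.random() < 0.5 for _ in range(flips)]
    # Collect the flip that follows every completed run of `streak` heads.
    after = [seq[i] for i in range(streak, flips) if all(seq[i - streak:i])]
    return sum(after) / len(after)
```

Running this yields a value close to 0.5: a run of heads tells you nothing about the next toss of a fair coin.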
- Inductive thinking can have devastating results. Yet we cannot do without it.
- To assume that our existence to date is an indication of our future survival is a serious flaw in reasoning. Probably the most serious of all.
- If you want to convince someone about something, don’t focus on the advantages; instead highlight how it helps them dodge the disadvantages.
- Management gurus push employees in large companies to be bolder and more entrepreneurial. The reality is: Employees tend to be risk averse.
- In almost all companies and situations, safeguarding your career trumps any potential reward.
- We can’t fight it: Evil is more powerful and more plentiful than good. We are more sensitive to negative than to positive things.
- Social loafing is rational behavior: Why invest all of your energy when half will do—especially when this little shortcut goes unnoticed?
- When people work together, individual performances decrease.
- In conclusion: People behave differently in groups than when alone (otherwise there would be no groups). The disadvantages of groups can be mitigated by making individual performances as visible as possible. Long live meritocracy! Long live the performance society!
- Linear growth we understand intuitively. However, we have no sense of exponential (or percentage) growth. Why is this? Because we didn’t need it before. Our ancestors’ experiences were mostly of the linear variety.
- In the Stone Age, people rarely came across exponential growth. Today, things are different.
- Nothing that grows exponentially grows forever. Most politicians, economists, and journalists forget that. Such growth will eventually reach a limit. Guaranteed.
- When it comes to growth rates, do not trust your intuition. You don’t have any. Accept it. What really helps is a calculator or, with low growth rates, the magic number of 70.
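The “magic number of 70” is a shortcut for the exact doubling-time formula; a small sketch (the function names are mine) shows how close the two are at low growth rates:

```python
import math

def doubling_time_rule_of_70(rate_percent):
    """Rule-of-70 shortcut: years to double at `rate_percent` annual growth."""
    return 70 / rate_percent

def doubling_time_exact(rate_percent):
    """Exact doubling time under compound growth: ln(2) / ln(1 + r)."""
    return math.log(2) / math.log(1 + rate_percent / 100)
```

At 5 percent growth the rule gives 14 years, while the exact value is about 14.2, which is why the shortcut is good enough for mental arithmetic.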
- The highest bid at an auction is often much too high—unless the bidder has critical information others are not privy to.
- Astoundingly, more than half of all acquisitions destroy value, according to a McKinsey study.
- Economic success depends far more on the overall economic climate and the industry’s attractiveness than on brilliant leadership.
- False causality leads us astray practically every day.
- Correlation is not causality.
- Take a closer look at linked events: Sometimes what is presented as the cause turns out to be the effect, and vice versa.
- The halo effect occurs when a single aspect dazzles us and affects how we see the full picture.
- The halo effect always works the same way: We take a simple-to-obtain or remarkable fact or detail, such as a company’s financial situation, and extrapolate conclusions from there that are harder to nail down, such as the merit of its management or the feasibility of its strategy.
- Alternative paths are all the outcomes that could have happened but did not.
- Risk is not directly visible. Therefore, always consider what the alternative paths are.
- The problem is that experts enjoy free rein with few negative consequences. If they strike it lucky, they enjoy publicity, consultancy offers, and publication deals. If they are completely off the mark, they face no penalties—neither in terms of financial compensation nor in loss of reputation. This win-win scenario virtually incentivizes them to churn out as many prophecies as they can muster. Indeed, the more forecasts they generate, the more will be coincidentally correct.
- Kahneman believes that two types of thinking exist: The first kind is intuitive, automatic, and direct. The second is conscious, rational, slow, laborious, and logical. Unfortunately, intuitive thinking draws conclusions long before the conscious mind does.
- With important decisions, remember that, at the intuitive level, we have a soft spot for plausible stories. Therefore, be on the lookout for convenient details and happy endings.
- Remember: If an additional condition has to be met, no matter how plausible it sounds, it will become less, not more, likely.
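The arithmetic behind this is simply that probabilities multiply: P(A and B) = P(A) × P(B given A), and since P(B given A) is at most 1, each extra condition can only shrink the total. A sketch with made-up numbers:

```python
def p_conjunction(p_a, p_b_given_a):
    """P(A and B) = P(A) * P(B given A); never larger than P(A) alone."""
    return p_a * p_b_given_a

# Hypothetical numbers: event A has a 5% chance, and B is quite
# likely (80%) once A holds. The combined story is still less likely
# than plain A, no matter how plausible B sounds.
p_a = 0.05
p_a_and_b = p_conjunction(p_a, 0.80)  # 0.04, still below p_a = 0.05
```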
- It’s not what you say but how you say it. If a message is communicated in different ways, it will also be received in different ways. In psychologists’ jargon, this technique is called framing.
- We react differently to identical situations, depending on how they are presented.
- Realize that whatever you communicate contains some element of framing, and that every fact—even if you hear it from a trusted friend or read it in a reputable newspaper—is subject to this effect, too.
- This is the action bias: Look active, even if it achieves nothing.
- Society at large still prefers rash action to a sensible wait-and-see strategy.
- In conclusion: In new or shaky circumstances, we feel compelled to do something, anything. Afterward we feel better, even if we have made things worse by acting too quickly or too often. So, though it might not merit a parade in your honor, if a situation is unclear, hold back until you can assess your options.
- Deliberate inaction somehow seems less grave than a comparable action—
- Studies show that commuting by car represents a major source of discontent and stress, and people hardly ever get used to it.
- People who change or progress in their careers are, in terms of happiness, right back where they started after around three months.
- Science calls this effect the hedonic treadmill: We work hard, advance, and are able to afford more and nicer things, and yet this doesn’t make us any happier.
- Use these scientifically rubber-stamped pointers to make better, brighter decisions: (a) Avoid negative things that you cannot grow accustomed to, such as commuting, noise, or chronic stress. (b) Expect only short-term happiness from material things, such as cars, houses, lottery winnings, bonuses, and prizes. (c) Aim for as much free time and autonomy as possible since long-lasting positive effects generally come from what you actively do. Follow your passions even if you must forfeit a portion of your income for them. Invest in friendships. For most people, professional status achieves long-lasting happiness, as long as they don’t change peer groups at the same time.
- Our brain is a connection machine.
- But how do you tell the difference between beginner’s luck and the first signs of real talent? There is no clear rule, but these two tips may help: First, if you are much better than others over a long period of time, you can be fairly sure that talent plays a part. (Unfortunately, you can never be 100 percent sure.) Second, the more people competing, the greater the chances are that one of them will repeatedly strike lucky.
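The second tip is easy to quantify: with enough competitors, someone is almost bound to post a flawless streak by pure chance. A back-of-the-envelope sketch (the function name and numbers are mine):

```python
def p_someone_streaks(players=1000, rounds=10):
    """Probability that at least one of `players` fair-coin flippers
    calls all `rounds` correctly by pure chance."""
    individual = 0.5 ** rounds  # about 0.1% for any single player
    return 1 - (1 - individual) ** players
```

One player almost never manages ten correct calls in a row, yet among a thousand players the odds that somebody does are better than even, so a single winner’s streak proves very little.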
- We place huge value on immediacy—much more than is justifiable. “Enjoy each day to the fullest and don’t worry about tomorrow” is simply not a smart way to live.
- Put plainly: The closer a reward is, the higher our “emotional interest rate” rises and the more we are willing to give up in exchange for it.
- Animals will never turn down an instant reward in order to attain more in the future.
- When you justify your behavior, you encounter more tolerance and helpfulness. It seems to matter very little if your excuse is good or not. Using the simple validation “because” is sufficient.
- Making decisions is exhausting.
- The contagion bias describes how we are incapable of ignoring the connection we feel to certain items—be they from long ago or only indirectly related (as with the photos).
- Dealing in averages is a risky undertaking because they often mask the underlying distribution—the way the values stack up.
- That’s a power law. A few extremes dominate the distribution, and the concept of average is rendered worthless.
- In conclusion: If someone uses the word “average,” think twice. Try to work out the underlying distribution.
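A quick simulation makes the point: in a heavy-tailed (Pareto) distribution, the mean sits far above the median, so the “average” describes almost nobody. A sketch with assumed parameters:

```python
import random
import statistics

def mean_vs_median_pareto(alpha=1.5, n=100_000, seed=7):
    """Sample a heavy-tailed Pareto distribution and compare mean to median.

    A few extreme draws drag the mean well above the typical (median)
    value, which is why averages mask power-law distributions.
    """
    rng = random.Random(seed)
    sample = [rng.paretovariate(alpha) for _ in range(n)]
    return statistics.fmean(sample), statistics.median(sample)
```

For a bell-curve quantity like body height, mean and median nearly coincide; for wealth or website traffic they diverge sharply, and that divergence is the warning sign to look for.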
- The point is: Small—surprisingly small—monetary incentives crowd out other types of incentives.
- Financial reward erodes any other motivations.
- “You would not believe how difficult it is to be simple and clear. People are afraid that they may be seen as a simpleton. In reality, just the opposite is true.”
- In conclusion: Verbal expression is the mirror of the mind. Clear thoughts become clear statements, whereas ambiguous ideas transform into vacant ramblings.
- The world is complicated, and it takes a great deal of mental effort to understand even one facet of the whole.
- Borges’s map is the extreme case of the information bias, the delusion that more information guarantees better decisions.
- Additional information not only wastes time and money, it can also put you at a disadvantage.
- Forget trying to amass all the data. Do your best to get by with the bare facts. It will help you make better decisions. Superfluous knowledge is worthless, whether you know it or not.
- When you put a lot of energy into a task, you tend to overvalue the result.
- Effort justification is a special case of “cognitive dissonance.”
- Groups use effort justification to bind members to them—for example, through initiation rites.
- Whenever you have invested a lot of time and effort into something, stand back and examine the result—only the result.
- Every investor knows it’s impossible to forecast financial results accurately.
- When expectations are fueled in the run-up to an announcement, any disparity gives rise to draconian punishment, regardless of how paltry the gap is.
- Expectations are intangible, but their effect is quite real. They have the power to change reality.
- As paradoxical as it sounds: The best way to shield yourself from nasty surprises is to anticipate them.
- Not everything that seems plausible is true. Reject the easy answers that pop into your head.
- People tend to identify many of their own traits in such universal descriptions. Science labels this tendency the Forer effect (or the “Barnum effect”). The Forer effect explains why the pseudosciences work so well—
- We accept whatever corresponds to our self-image and unconsciously filter everything else out.
- It is hard to imagine a storm of the century if you’re only thirty years old.
- We are the descendants of quick decision makers, and we rely on mental shortcuts called heuristics.
- Seemingly insignificant factors influence our emotions.
- Whether we like it or not, we are puppets of our emotions. We make complex decisions by consulting our feelings, not our thoughts.
- When we soul-search, we contrive the findings. The belief that reflection leads to truth or accuracy is called the introspection illusion.
- Because we are so confident of our beliefs, we experience three reactions when someone fails to share our views. Response 1: Assumption of ignorance.
- 2: Assumption of idiocy.
- 3: Assumption of malice.
- In conclusion: Nothing is more convincing than your own beliefs. We believe that introspection unearths genuine self-knowledge. Unfortunately, introspection is, in large part, fabrication, posing two dangers: First, the introspection illusion creates inaccurate predictions of future mental states.
- Second, we believe that our introspections are more reliable than those of others, which creates an illusion of superiority.
- Be all the more critical with yourself. Regard your internal observations with the same skepticism as claims from some random person. Become your own toughest critic.
- We mere mortals do everything we can to keep open the maximum number of options.
- Each decision costs mental energy and eats up precious time for thinking and living.
- We are obsessed with having as many irons as possible in the fire, ruling nothing out, and being open to everything. However, this can easily destroy success.
- Assume that most of the technology that has existed for the past fifty years will serve us for another half century. And assume that recent technology will be passé in a few years’ time.
- Old technology has proven itself; it possesses an inherent logic even if we do not always understand it. If something has endured for epochs, it must be worth its salt.
- MBA programs attract career-oriented people who will probably earn above-average salaries at some stage of their careers, even without the extra qualification of an MBA.
- We systematically forget to compare an existing offer with the next-best alternative.
- Recommendation: Hire people who are better than you; otherwise you will soon preside over a pack of underdogs.
- The inept are gifted at overlooking the extent of their incompetence.
- The more recent the information, the better we remember it. This occurs because our short-term memory file drawer, as it were, contains very little extra space. When a new piece of information gets filed, an older piece of information is discarded to make room.
- NIH syndrome causes you to fall in love with your own ideas.
- We overlook shrewd ideas simply because they come from other cultures.
- A Black Swan is an unthinkable event that massively affects your life, your career, your company, your country. There are positive and negative Black Swans.
- There are things we know (“known facts”), there are things we do not know (“known unknowns”), and there are things we do not know that we do not know (“unknown unknowns”).
- Why are Black Swans important? Because, as absurd as it may sound, they are cropping up more and more frequently and they tend to become more consequential.
- Since probabilities cannot fall below zero, and our thought processes are prone to error, you should assume that everything has an above-zero probability.
- Stay out of debt, invest your savings as conservatively as possible, and get used to a modest standard of living—no matter whether your big breakthrough comes or not.
- The conclusion: Insights do not pass well from one field to another. This effect is called domain dependence.
- Business is teeming with domain dependence.
- What you master in one area is difficult to transfer to another.
- Book smarts don’t transfer to street smarts easily.
- We frequently overestimate unanimity with others, believing that everyone else thinks and feels exactly like we do. This fallacy is called the false-consensus effect.
- If people do not share our opinions, we categorize them as “abnormal.”
- Social proof is an evolutionary survival strategy. Following the crowd has saved our butts more often in the past hundred thousand years than striking out on our own.
- It is safe to assume that half of what you remember is wrong. Our memories are riddled with inaccuracies, including the seemingly flawless flashbulb memories. Our faith in them can be harmless—or lethal.
- Prejudice and aversion are biological responses to anything foreign. Identifying with a group has been a survival strategy for hundreds of thousands of years.
- Identifying with a group distorts your view of the facts.
- The Ellsberg Paradox offers empirical proof that we favor known probabilities (box A) over unknown ones (box B).
- Risk means that the probabilities are known. Uncertainty means that the probabilities are unknown.
- You can make calculations with risk, but not with uncertainty.
- To avoid hasty judgment, you must learn to tolerate ambiguity.
- People crave what they know. Given the choice of trying something new or sticking to the tried-and-tested option, we tend to be highly conservative, even if a change would be beneficial.
- Regret is the feeling of having made the wrong decision.
- Salience refers to a prominent feature, a stand-out attribute, a particularity, something that catches your eye. The salience effect ensures that outstanding features receive much more attention than they deserve.
- The salience effect influences not only how we interpret the past but also how we imagine the future.
- Even professional analysts cannot always evade the salience effect.
- Salient information has an undue influence on how you think and act. We tend to neglect hidden, slow-to-develop, discreet factors. Do not be blinded by irregularities.
- Gather enough mental energy to fight against seemingly obvious explanations.
- We treat money that we win, discover, or inherit much more frivolously than hard-earned cash.
- Be careful if you win money or if a business gives you something for free. Chances are you will pay it back with interest out of sheer exuberance.
20171006
THE ART OF THINKING CLEARLY by Rolf Dobelli