20171116

MINDWARE by Richard E. Nisbett


  • Sometimes commonsense approaches to problems produce errors in judgment and unfortunate actions.
  • The mind is like a muscle in some ways but not in others. Lifting pretty much anything will make you stronger. But thinking about just anything in any old way is not likely to make you smarter.
  • The key is learning how to frame events in such a way that the relevance of the principles to the solutions of particular problems is made clear, and learning how to code events in such a way that the principles can actually be applied to the events.
  • Understanding what we can and can’t observe about our mental life tells us when to rely on intuition when solving a problem and when to turn to explicit rules about categorization, choice, or assessment of causal explanations.
  • When we look at a bird or a chair or a sunset, it feels as if we’re simply registering what is in the world. But in fact our perceptions of the physical world rely heavily on tacit knowledge and on mental processes we’re unaware of, which help us perceive something or categorize it accurately.
  • It’s more unsettling to learn that our understanding of the nonmaterial world, including our beliefs about the characteristics of other people, is also utterly dependent on stored knowledge and hidden reasoning processes.
  • Since the 1920s, psychologists have made much use of the schema concept. The term refers to cognitive frameworks, templates, or rule systems that we apply to the world to make sense of it.
  • We have schemas for virtually every kind of thing we encounter.
  • Schemas affect our behavior as well as our judgments.
  • A serious problem with our reliance on schemas and stereotypes is that they can get triggered by incidental facts that are irrelevant or misleading.
  • Any stimulus we encounter will trigger spreading activation to related mental concepts.
  • Our construal of the nature and meaning of events is massively dependent on stored schemas and the inferential processes they initiate and guide.
  • All “reality” is merely an arbitrary construal of the world. This view has a long history. Right now its advocates tend to call themselves “postmodernists” or “deconstructionists.” Many people answering to these labels endorse the idea that the world is a “text” and no reading of it can be held to be any more accurate than any other.
  • Spreading activation makes us susceptible to all kinds of unwanted influences on our judgments and behavior.
  • The most obvious implication of all the evidence about the importance of incidental stimuli is that you want to rig environments so that they include stimuli that will make you or your product or your policy goals attractive.
  • Less obvious are two facts: (1) The effect of incidental stimuli can be huge, and (2) you want to know as much as you possibly can about what kinds of stimuli produce what kinds of effects.
  • Our construal of objects and events is influenced not just by the schemas that are activated in particular contexts, but by the framing of judgments we have to make.
  • We often arrive at judgments or solve problems by use of heuristics—rules of thumb that suggest a solution to a problem.
  • The conjunction of two events can’t be more likely than just one event by itself. (A worked example appears after these notes.)
  • Simply put, we see patterns in the world where there are none because we don’t understand just how un-random-looking random sequences can be. (See the coin-flip sketch after these notes.)
  • The best predictor of future behavior is past behavior. You’re rarely going to do better than that.
  • Remember that all perceptions, judgments, and beliefs are inferences and not direct readouts of reality.
  • Be aware that our schemas affect our construals.
  • Remember that incidental, irrelevant perceptions and cognitions can affect our judgment and behavior.
  • Be alert to the possible role of heuristics in producing judgments.
  • The fundamental attribution error gets us in trouble constantly.
  • We should choose our acquaintances carefully because we’re going to be highly influenced by them. This is especially true for young people: the younger you are, the more influenced you are by peers’ attitudes and behaviors.
  • One of a parent’s most important and challenging roles is to make sure their children’s acquaintances are likely to be good influences.
  • There are often self-serving motives behind such attributions, but it’s important to know that people generally think that their own behavior is largely a matter of responding sensibly to the situation they happen to be in, whether that behavior is admirable or abominable.
  • There is vastly more going on in our heads than we realize.
  • Pay more attention to context. This will improve the odds that you’ll correctly identify situational factors that are influencing your behavior and that of others.
  • Realize that situational factors usually influence your behavior and that of others more than they seem to, whereas dispositional factors are usually less influential than they seem.
  • Realize that other people think their behavior is more responsive to situational factors than you’re inclined to think—and they’re more likely to be right than you are.
  • Recognize that people can change.
  • Change the environment and you change the person.
  • We generally feel that we’re fairly knowledgeable about what’s going on in our heads—what it is we’re thinking about and what thinking processes are going on. But an absolute gulf separates this belief from reality.
  • Don’t assume that you know why you think what you think or do what you do.
  • Don’t assume that other people’s accounts of their reasons or motives are any more likely to be right than are your accounts of your own reasons or motives.
  • Consciousness is necessary for checking and elaborating on conclusions reached by the unconscious mind.
  • The most important thing I have to tell you—in this whole book—is that you should never fail to take advantage of the free labor of the unconscious mind.
  • If you’re not making progress on a problem, drop it and turn to something else.
  • Microeconomists are not agreed on just how it is that people make decisions or how they should make them. They do agree, however, that cost-benefit analysis of some kind is what people normally do, and should do.
  • The more important and complicated the decision, the more important it is to do such an analysis.
  • Even an obviously flawed cost-benefit analysis can sometimes show in high relief what the decision must be.
  • There is no fully adequate metric for costs and benefits, but it’s usually necessary to compare them anyway.
  • Calculations of the value of a human life are repellent and sometimes grossly misused, but they are often necessary nonetheless in order to make sensible policy decisions.
  • Tragedies of the commons, where my gain creates negative externalities for you, typically require binding and enforceable intervention.
  • Expended resources that can’t be retrieved should not be allowed to influence a decision about whether to consume something that those resources were used to obtain.
  • You should avoid engaging in an activity that has lower net benefit than some other action you could take now or in the future.
  • Falling into the sunk cost trap always entails paying unnecessary opportunity costs.
  • Attention to costs and benefits, including sunk cost and opportunity cost traps, pays.
  • Loss considerations tend to loom too large relative to gain considerations. Loss aversion causes us to miss out on a lot of good deals.
  • We’re overly susceptible to the endowment effect—valuing a thing more than we should simply because it’s ours.
  • We’re a lazy species: we hang on to the status quo for no other reason than that it’s the way things are.
  • Choice is way overrated. Too many choices can confuse and make decisions worse—or prevent needed decisions from being made.
  • When we try to influence the behavior of others, we’re too ready to think in terms of conventional incentives—carrots and sticks.
  • Rather than pushing people or pulling people, try removing barriers and creating channels that make the most sensible behavior the easiest option.
  • You simply can’t live an optimal life in today’s world without basic knowledge of statistics.
  • Observations of objects or events should often be thought of as samples of a population.
  • The fundamental attribution error is primarily due to our tendency to ignore situational factors, but this is compounded by our failure to recognize that a brief exposure to a person constitutes a small sample of a person’s behavior.
  • Increasing sample size reduces error only if the sample is unbiased.
  • The standard deviation is a handy measure of the dispersion of a continuous variable around the mean.
  • If we know that an observation of a particular kind of variable comes from the extreme end of the distribution of that variable, then it’s likely that additional observations are going to be less extreme. (See the simulation after these notes.)
  • Accurate assessment of relationships can be remarkably difficult.
  • When we try to assess correlations for which we have no anticipations, as when we try to estimate the correlation between meaningless or arbitrarily paired events, the correlation must be very high for us to be sure of detecting it.
  • We’re susceptible to illusory correlations.
  • The representativeness heuristic underlies many of our prior assumptions about correlation.
  • Correlation doesn’t establish causation, but if there’s a plausible reason why A might cause B, we readily assume that correlation does indeed establish causation.
  • Reliability refers to the degree to which a case gets the same score on two occasions or when measured by different means. Validity refers to the degree to which a measure predicts what it’s supposed to predict.
  • The more codable events are, the more likely it is that our assessments of correlation will be correct.
  • Caution and humility are called for when we try to predict future trait-related behavior from past trait-related behavior unless our sample of behavior is large and obtained in a variety of situations.
  • Reminding ourselves of the concept of the fundamental attribution error may help us to realize that we may be overgeneralizing.
  • Institutions increasingly rely on experiments to provide them with information. That’s a good thing, because if you can do an experiment to answer a question, it’s nearly always going to be better than correlational techniques.
  • Assumptions tend to be wrong. And even when they aren’t, it’s silly to rely on them whenever it’s easy to test them.
  • Correlational designs are weak because the researcher hasn’t assigned the cases to their condition.
  • The greater the number of cases—people, agricultural plots, and so on—the greater the likelihood that you’ll find a real effect and the lower the likelihood that you will “find” an effect that isn’t there. (See the power sketch after these notes.)
  • When you assign each case to all of the possible treatments (a within-subject design), your design is more sensitive.
  • It’s crucial to consider whether the cases you’re examining (people in the case of research on humans) could influence one another.
  • Sometimes we can observe relationships that come close to being as convincing as a genuine experiment.
  • The randomized control experiment is frequently called the gold standard in scientific and medical research—with good reason. Results from such studies trump results from any and all other kinds of studies.
  • Society pays a high cost for experiments not carried out.
  • Multiple regression analysis (MRA) examines the association between an independent variable and a dependent variable, controlling for the other independent variables included in the analysis.
  • The fundamental problem with MRA, as with all correlational methods, is self-selection. (See the toy simulation after these notes.)
  • When a competently conducted experiment tells you one thing about a given relationship and MRA tells you another, you normally must believe the experiment.
  • A basic problem with MRA is that it typically assumes that the independent variables can be regarded as building blocks, with each variable taken by itself being logically independent of all the others.
  • Just as correlation doesn’t prove causation, absence of correlation fails to prove absence of causation.
  • Verbal reports are susceptible to a huge range of distortions and errors.
  • Answers to questions about attitudes are frequently based on tacit comparison with some reference group.
  • Reports about the causes of our behavior, as you learned in Chapter 3 and were reminded of in this chapter, are susceptible to a host of errors and incidental influences.
  • Actions speak louder than words. Behavior is a better guide to understanding people’s attitudes and personalities than are verbal responses.
  • Conduct experiments on yourself. The same methodologies that psychologists use to study people can be used to study yourself.
  • If the structure of your argument can be mapped directly onto one of the valid forms of argument that logic specifies, you’re guaranteed a deductively valid conclusion. (See the truth-table sketch after these notes.)
  • Logic divests arguments of any references to the real world so that the formal structure of an argument can be laid bare without any interference from prior beliefs.
  • The truth of a conclusion and the validity of the argument that produces it are entirely separate things.
  • Venn diagrams embody syllogistic reasoning and can be helpful or even necessary for solving some categorization problems.
  • Errors in deductive reasoning are sometimes made because deductively invalid arguments map onto argument forms that are inductively valid.
  • Pragmatic reasoning schemas are abstract rules of reasoning that underlie much of thought.
  • Some of the fundamental principles underlying Western and Eastern thought are different. Western thought is analytic and emphasizes logical concepts of identity and insistence on noncontradiction; Eastern thought is holistic and encourages recognition of change and acceptance of contradiction.
  • Western thought encourages separation of form from content in order to assess validity of arguments.
  • Eastern thought produces more accurate beliefs about some aspects of the world and the causes of human behavior than Western thought.
  • Westerners and Easterners respond in quite different ways to contradictions between two propositions.
  • Eastern and Western approaches to history are very different.
  • Western thought has been influenced substantially by Eastern thought in recent decades.
  • Reasoning about social conflict by younger Japanese is wiser than that of younger Americans. But Americans gain in wisdom over their life span and Japanese do not.
  • Epistemology is the study of what counts as knowledge, how we can best obtain knowledge, and what can be known with certainty.
  • Explanations should be kept simple. They should call on as few concepts as possible, defined as simply as possible.
  • Reductionism in the service of simplicity is a virtue; reductionism for its own sake can be a vice. Events should be explained at the most basic level possible.
  • We don’t realize how easy it is for us to generate plausible theories.
  • Our approach to hypothesis testing is flawed in that we’re inclined to search only for evidence that would tend to confirm a theory while failing to search for evidence that would tend to disconfirm it.
  • A theorist who can’t specify what kind of evidence would be disconfirmatory should be distrusted.
  • Falsifiability of a theory is only one virtue; confirmability is even more important.
  • We should be suspicious of theoretical contrivances that are proposed merely to handle apparently disconfirmatory evidence but are not intrinsic to the theory.
  • Science is not based solely on evidence and well-justified theories; faith and hunches may cause scientists to ignore established scientific hypotheses and agreed-upon facts.
  • The paradigms that underlie a given body of scientific work, as well as those that form the basis for technologies, industries, and commercial enterprises, are subject to change without notice.
  • Different cultural practices and beliefs can produce different scientific theories, paradigms, and even forms of reasoning.
  • Quasi-rational practices by scientists, and cultural influences on belief systems and reasoning patterns, may have encouraged postmodernists and deconstructionists to press the view that there are no facts, only socially agreed-upon interpretations of reality.
  • Our conviction that we know the world directly, by unmediated perception of facts, is what philosophers call “naive realism.”
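
A worked example of the conjunction rule noted above, with probabilities invented purely for illustration (the classic “Linda” framing): the probability of two things both being true can never exceed the probability of either one alone.

    # Conjunction rule: P(A and B) = P(A) * P(B given A), which can never exceed P(A).
    # The numbers here are made up solely to make the arithmetic concrete.
    p_teller = 0.05                    # P(Linda is a bank teller)
    p_feminist_given_teller = 0.30     # P(Linda is a feminist, given she is a teller)

    p_both = p_teller * p_feminist_given_teller   # 0.015
    print(p_both <= p_teller)                     # True, no matter what numbers you pick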
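The coin-flip sketch referred to above: a quick simulation of a perfectly fair coin, counting how often 100 flips contain a run of five or more identical outcomes. Such streaks look like patterns but are routine in random data.

    import random

    def longest_run(flips):
        """Length of the longest run of identical outcomes in a sequence."""
        best = current = 1
        for prev, nxt in zip(flips, flips[1:]):
            current = current + 1 if nxt == prev else 1
            best = max(best, current)
        return best

    random.seed(0)
    runs = [longest_run([random.choice("HT") for _ in range(100)]) for _ in range(1000)]
    print(sum(r >= 5 for r in runs) / len(runs))   # the vast majority of sequences contain a run of 5+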
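The simulation referred to above, on dispersion and on extreme observations: it assumes, purely for illustration, that a test score mixes a stable ability component with day-to-day luck. People selected for extreme scores on one test score closer to the mean on a retest.

    import random, statistics

    random.seed(1)
    ability = [random.gauss(500, 80) for _ in range(10_000)]    # stable component
    test1 = [a + random.gauss(0, 60) for a in ability]          # ability plus luck, day 1
    test2 = [a + random.gauss(0, 60) for a in ability]          # ability plus luck, day 2

    print(round(statistics.stdev(test1)))   # standard deviation: dispersion around the mean (~100)

    cutoff = sorted(test1)[-200]            # roughly the top 2% on the first test
    top1 = [t1 for t1 in test1 if t1 >= cutoff]
    top2 = [t2 for t1, t2 in zip(test1, test2) if t1 >= cutoff]
    print(round(statistics.mean(top1)), round(statistics.mean(top2)))   # retest scores regress toward 500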
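The power sketch referred to above: a rough simulation assuming a real but modest difference between a treated and a control group. With small samples the difference is usually missed; with larger samples it is usually detected. The “test” here is a crude two-standard-error rule, not a full t-test.

    import random, statistics

    def detects_effect(n, true_diff=0.3):
        """Is the observed group difference more than about two standard errors from zero?"""
        control = [random.gauss(0, 1) for _ in range(n)]
        treated = [random.gauss(true_diff, 1) for _ in range(n)]
        diff = statistics.mean(treated) - statistics.mean(control)
        se = (statistics.variance(control) / n + statistics.variance(treated) / n) ** 0.5
        return abs(diff) > 2 * se

    random.seed(2)
    for n in (20, 80, 320):
        detection_rate = sum(detects_effect(n) for _ in range(1000)) / 1000
        print(n, detection_rate)   # the detection rate climbs steadily with sample size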
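The toy simulation referred to above, on self-selection: it assumes an unmeasured trait (“motivation”) that drives both who signs up for a program and the outcome, while the program itself does nothing. A regression on the observational data shows a large spurious “effect”; random assignment on the same population shows none.

    import random, statistics

    random.seed(3)
    N = 10_000
    motivation = [random.gauss(0, 1) for _ in range(N)]              # unmeasured confounder

    # Observational data: motivated people select into a program that has NO real effect.
    takes_program = [float(m + random.gauss(0, 1) > 0) for m in motivation]
    outcome = [2.0 * m + random.gauss(0, 1) for m in motivation]

    def slope(x, y):
        """OLS slope of y on x, i.e., the regression estimate of the 'effect' of x."""
        mx, my = statistics.mean(x), statistics.mean(y)
        return sum((a - mx) * (b - my) for a, b in zip(x, y)) / sum((a - mx) ** 2 for a in x)

    print(round(slope(takes_program, outcome), 2))                   # large spurious "effect" (~2.3)

    # A randomized experiment on the same people: assignment is independent of motivation.
    assigned = [float(random.random() < 0.5) for _ in range(N)]
    print(round(slope(assigned, outcome), 2))                        # ~0.0, the true effect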
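The truth-table sketch referred to above: stripped of content, a valid form such as modus ponens (“if p then q; p; therefore q”) can never take true premises to a false conclusion, whereas affirming the consequent (“if p then q; q; therefore p”) can.

    from itertools import product

    def implies(p, q):
        return (not p) or q

    for p, q in product([True, False], repeat=2):
        # Modus ponens: premises "if p then q" and "p"; conclusion "q".
        if implies(p, q) and p:
            assert q                                    # never fails: the form is valid
        # Affirming the consequent: premises "if p then q" and "q"; conclusion "p".
        if implies(p, q) and q and not p:
            print("counterexample: p =", p, ", q =", q)  # true premises, false conclusion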
