20170409

"THE BELIEVING BRAIN: FROM GHOSTS AND GODS TO POLITICS AND CONSPIRACIES---HOW WE CONSTRUCT BELIEFS AND REINFORCE THEM AS TRUTHS" by Michael Shermer


  • How can we tell the difference between what we would like to be true and what is actually true? The answer is science.
  • More people believe in angels and the devil than believe in the theory of evolution.
  • a majority of people hold some form of paranormal or supernatural belief.
  • Students are taught what to think but not how to think.
  • So maybe the key to attenuating superstition and belief in the supernatural is in teaching how science works, not just what science has discovered.
  • The brain is a belief engine.
  • patternicity: the tendency to find meaningful patterns in both meaningful and meaningless data.
  • agenticity: the tendency to infuse patterns with meaning, intention, and agency.
  • Our brains evolved to connect the dots of our world into meaningful patterns that explain why things happen.
  • “It is clear that we cannot distinguish the sane from the insane in psychiatric hospitals,” Rosenhan concluded.
  • Your conclusions are only as sound as your premises.
  • The number one predictor of anyone’s religious beliefs is that of their parents and the religious environment of the family.
  • A common myth most of us intuitively accept is that there is a negative correlation between intelligence and belief: as intelligence goes up belief in superstition or magic goes down.
  • once people commit to a belief, the smarter they are the better they are at rationalizing those beliefs.
  • smart people believe weird things because they are skilled at defending beliefs they arrived at for nonsmart reasons.
  • Does God embrace moral principles naturally occurring and external to him because they are sound (“holy”), or are these moral principles sound only because God says that they are sound?
  • The only way to find out whether anecdotes represent real phenomena is through controlled tests.
  • Type I error in cognition, also known as a false positive, or believing something is real when it is not.
  • Type II error in cognition, also known as a false negative, or believing something is not real when it is.
  • Our brains are belief engines, evolved pattern-recognition machines that connect the dots and create meaning out of the patterns that we think we see in nature. Sometimes A really is connected to B; sometimes it is not.
  • We are the descendants of those who were most successful at finding patterns. This process is called association learning and is fundamental to all animal behavior, from C. elegans to H. sapiens. I call this process patternicity, or the tendency to find meaningful patterns in both meaningful and meaningless noise.
  • Unfortunately, we did not evolve a baloney-detection network in the brain to distinguish between true and false patterns.
  • Patternicity (P) will occur whenever the cost (C) of making a Type I error (TI) is less than the cost (C) of making a Type II error (TII). (A toy expected-cost sketch of this inequality appears after these notes.)
  • There was a natural selection for the cognitive process of assuming that all patterns are real and that all patternicities represent real and important phenomena.
  • We are the descendants of the primates who most successfully employed patternicity.
  • The problem we face is that superstition and belief in magic are millions of years old whereas science, with its methods of controlling for intervening variables to circumvent false positives, is only a few hundred years old.
  • Anecdotal thinking comes naturally; science requires training.
  • Our greater capacity for learning is often offset by our greater capacity for magical thinking.
  • Facial-recognition software was built into our brains by evolution because of the importance of the face in establishing and maintaining relationships, reading emotions, and determining trust in social interactions.
  • Faces are important to a social primate species such as ourselves. This is why we are so inclined to see faces in random patterns in nature.
  • People who rate high on internal locus of control tend to believe that they make things happen and that they are in control of their circumstances, whereas people who score high on external locus of control tend to think that circumstances are beyond their control and that things just happen to them.
  • Uncertainty makes people anxious, and anxiety is related to magical thinking.
  • We are natural-born supernaturalists, driven by our tendency to find meaningful patterns and impart to them intentional agency.
  • The evidence that brain and mind are one is now overwhelming.
  • When asked about Geller’s ability to pass the tests of professional scientists, Randi explained that scientists are not trained to detect trickery and intentional deception, the very art of magic.
  • Herein lies an important lesson that I have learned in many years of paranormal investigations: what people remember happening rarely corresponds to what actually happened.
  • Brain functions can be roughly divided into two processes: controlled and automatic. Controlled processes tend to use linear step-by-step logic and are deliberately employed, and we are aware of them when we use them. Automatic processes operate unconsciously, nondeliberately, and in parallel.
  • All experience is mediated by the brain. The mind is what the brain does. There is no such thing as “mind” per se, outside of brain activity. Mind is just a word we use to describe neural activity in the brain. No brain, no mind.
  • Monists believe that there is just one substance in our head—brain. Dualists, by contrast, believe that there are two substances—brain and mind.
  • beliefs come first; reasons for belief follow, confirming a realism that depends on the belief.
  • Spinoza’s conjecture: belief comes quickly and naturally, skepticism is slow and unnatural, and most people have a low tolerance for ambiguity.
  • The scientific principle that a claim is untrue unless proven otherwise runs counter to our natural tendency to accept as true that which we can comprehend quickly.
  • Theory: the deeper reason scientists remain skeptical of psi—and will remain so even if more significant data are published—is that there is no explanatory theory for how psi works.
  • the fact that we cannot fully explain a mystery with natural means does not mean it requires a supernatural explanation.
  • When we are dealing with such topics as the afterlife, there is the problem of fuzzy language in using words such as mind, will, intention, and purpose.
  • The burden of proof is on the believer to prove God’s existence, not on the nonbeliever to disprove God’s existence.
  • There is evidence that God and religion are human and social constructions based on research from psychology, anthropology, history, comparative mythology, and sociology.
  • In other words, agnosticism is an intellectual position, a statement about the existence or nonexistence of the deity and our ability to know it with certainty, whereas atheism is a behavioral position, a statement about what assumptions we make about the world in which we behave.
  • Words matter and labels carry baggage.
  • Science operates in the natural, not the supernatural. In fact, there is no such thing as the supernatural or the paranormal. There is just the natural, the normal, and mysteries we have yet to explain by natural causes.
  • Invoking such words as supernatural and paranormal just provides a linguistic placeholder until we find natural and normal causes, or we do not find them and discontinue the search out of lack of interest.
  • Religion is a social institution that evolved to reinforce group cohesion and moral behavior.
  • The metaphor of memory as a videotape-playback system is completely wrong. There is no recording device in the brain. Memories are formed as part of the association learning system of making connections between things and events in the environment, and repetitive associations between them generate new dendritic and synaptic connections between neurons, which are then strengthened through additional repetition or weakened through disuse. Use it or lose it.
  • Your culture dictates what labels to assign these anomalous brain experiences.
  • But skeptical alarms should toll whenever anyone claims that science has discovered that our deepest desires and oldest myths are true after all.
  • We must always remember how flawed human behavior is, and the natural tendency we all have to make mistakes. Most of the time in most circumstances most people are not nearly as powerful as we think they are.
  • Conspiracy theorists connect the dots of random events into meaningful patterns, and then infuse those patterns with intentional agency. Add to those propensities the confirmation bias and the hindsight bias (in which we tailor after-the-fact explanations to what we already know happened), and we have the foundation for conspiratorial cognition.
  • The belief that a handful of unexplained anomalies can undermine a well-established theory lies at the heart of all conspiratorial thinking. It is easily refuted by noting that beliefs and theories are not built on single facts alone, but on a convergence of evidence from multiple lines of inquiry.
  • By definition, a conspiracy is a secret plan by two or more people to commit an illegal, immoral, or subversive action against another without their knowledge or agreement.
  • But as G. Gordon Liddy once told me, the problem with government conspiracies is that bureaucrats are incompetent and people can’t keep their mouths shut.
  • We do not reason our way to a moral decision by carefully weighing the evidence for and against; instead, we make intuitive leaps to moral decisions and then rationalize the snap decision after the fact with rational reasons.
  • We have evolved a deep sense of empathy and sympathy for others as we imagine ourselves in their position and what a situation would feel like if it were to happen to us.
  • We evolved a natural tendency to defer to authority, show deference to leaders and experts, and follow the rules and dictates given by those above us in social rank.
  • (Take the survey yourself at http://www.yourmorals.org.)
  • In other words, liberals question authority, celebrate diversity, and often flout faith and tradition in order to care for the weak and oppressed. They want change and justice even at the risk of political and economic chaos. By contrast, conservatives emphasize institutions and traditions, faith and family, and nation and creed. They want order even at the cost of those at the bottom falling through the cracks.
  • Libertarianism is grounded in the Principle of Freedom: all people are free to think, believe, and act as they choose, so long as they do not infringe on the equal freedom of others.
  • Given enough opportunities, outlier anomalies will inevitably happen. (A back-of-the-envelope calculation of this appears after these notes.)
  • Thanks to selective memory and the confirmation bias, we will remember only those few astonishing coincidences and forget the vast sea of meaningless data.
  • Folk numeracy is our natural tendency to misperceive probabilities, to think anecdotally instead of statistically, and to focus on and remember short-term trends and small-number runs.
  • A heuristic is a mental method of solving a problem through intuition, trial and error, or informal methods when there is no formal means or formula for solving it (and often even when there is). These heuristics are sometimes called rules of thumb, although they are better known as cognitive biases because they almost always distort percepts to fit preconceived concepts.
  • confirmation bias, or the tendency to seek and find confirmatory evidence in support of already existing beliefs and ignore or reinterpret disconfirming evidence.
  • The confirmation bias is best captured in the biblical wisdom Seek and ye shall find.
  • In a type of time-reversal confirmation bias, the hindsight bias is the tendency to reconstruct the past to fit with present knowledge. Once an event has occurred, we look back and reconstruct how it happened, why it had to happen that way and not some other way, and why we should have seen it coming all along.
  • The self-justification bias is the tendency to rationalize decisions after the fact to convince ourselves that what we did was the best thing we could have done.
  • fundamental attribution bias, or the tendency to attribute different causes to our own beliefs and actions than to those of others.
  • the status quo bias, or the tendency to opt for whatever it is we are used to, that is, the status quo.
  • the endowment effect, or the tendency to value what we own more than what we do not own.
  • Thaler has found that owners of an item value it roughly twice as much as potential buyers of the same item.
  • The longer we hold a belief, the more we have invested in it; the more publicly committed we are to it, the more we endow it with value and the less likely we are to give it up.
  • How beliefs are framed often determines how they are assessed, and this is called the framing effect, or the tendency to draw different conclusions based on how data are presented.
  • Lacking some objective standard to evaluate beliefs and decisions—which is usually not available—we grasp for any standard on hand, no matter how seemingly subjective. Such standards are called anchors, and this creates the anchoring effect, or the tendency to rely too heavily on a past reference or on one piece of information when making decisions.
  • Hundreds of experiments reveal time and again that people make snap decisions under high levels of uncertainty, and they do so by employing these various rules of thumb to shortcut the computational process.
  • Inattentional blindness is the tendency to miss something obvious and general while attending to something special and specific.
  • The bias blind spot is really a meta-bias in that it is grounded in all the other cognitive biases. It is the tendency to recognize the power of cognitive biases in other people but to be blind to their influence upon our own beliefs.
  • Extraordinary events do not always require extraordinary causes. Given enough time and opportunity, they can happen by chance.
  • To reiterate my thesis: beliefs come first, the explanations for the beliefs follow.
  • “If it disagrees with experiment, it is wrong. In that simple statement is the key to science. It doesn’t make any difference how beautiful your guess is, how smart you are, who made the guess, or what his name is. If it disagrees with experiment, it’s wrong. That’s all there is to it.”
  • When you look out into space the distances are so enormous that you are looking back into time; appropriately, astronomers call this lookback time.
  • Science begins with something called a null hypothesis. Although statisticians mean something very specific about this (having to do with comparing different sets of data), I am using the term null hypothesis in its more general sense: the hypothesis under investigation is not true, or null, until proven otherwise. (A minimal worked example in the statistical sense appears after these notes.)
  • in my experience what people say they can do and what they can actually do are not always the same.
  • The null hypothesis also means that the burden of proof is on the person asserting a positive claim, not on the skeptics to disprove it.
  • Yes, governments lie to their citizens, but lying about X does not make Y true. Terrestrial secrets do not equate to extraterrestrial cover-ups.
  • So many claims of this nature are based on negative evidence. That is, if science cannot explain X, then your explanation for X is necessarily true. Not so. In science lots of mysteries remain unexplained until further evidence arises, and problems are often left unsolved until another day.
  • “When the mind is confronted with more information than it can absorb, it looks for meaningful (and usually confirmatory) patterns. As a consequence, we tend to minimize evidence that is incongruous with our expectations, causing the dominant worldview to bring about its own reaffirmation.”
  • The principle of positive evidence states that you must have positive evidence in favor of your theory and not just negative evidence against rival theories. The principle of positive evidence applies to all claims.
  • Most religious claims are testable, such as prayer positively influencing healing.
  • An emotional leap of faith beyond reason is often required just to get through the day, let alone make the big decisions in life.
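The cost inequality in the patternicity note above can be made concrete with a toy expected-cost comparison. This is only a sketch under assumed values: the probability that the pattern is real and the two costs are illustrative, not figures from the book, and weighting each cost by that probability is a small elaboration on the bare inequality.

    # Toy expected-cost sketch of patternicity (illustrative values, not from the book)
    def should_assume_pattern(p_real, cost_false_positive, cost_false_negative):
        """Assume a pattern is real when the expected cost of acting on noise
        (a Type I error) is lower than the expected cost of ignoring a real
        pattern (a Type II error)."""
        expected_cost_of_believing = (1 - p_real) * cost_false_positive
        expected_cost_of_ignoring = p_real * cost_false_negative
        return expected_cost_of_believing < expected_cost_of_ignoring

    # A rustle in the grass: fleeing from wind is cheap; ignoring a predator is fatal.
    print(should_assume_pattern(p_real=0.05, cost_false_positive=1, cost_false_negative=1000))  # True

Even when the pattern is probably noise (95 percent of the time here), the asymmetry of the costs makes "assume it is real" the cheaper default, which is the selection pressure the book describes.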
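The notes on outlier anomalies and folk numeracy can likewise be checked with back-of-the-envelope arithmetic. The figures below (a one-in-a-million event, roughly 300 million people) are illustrative assumptions, not numbers from the book.

    # Rough arithmetic: rare coincidences become routine at scale (illustrative numbers)
    p_rare = 1e-6            # a "one in a million" coincidence per person per day
    people = 300_000_000     # roughly the population of the United States

    expected_per_day = p_rare * people            # expected number of such coincidences today
    p_at_least_one = 1 - (1 - p_rare) ** people   # probability that at least one occurs today

    print(round(expected_per_day))   # 300 -- about 300 such coincidences every single day
    print(p_at_least_one)            # 1.0 -- at least one is a near certainty

Selective memory then keeps the few hundred hits and discards the hundreds of millions of uneventful person-days, which is how the confirmation bias turns ordinary statistics into apparent miracles.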
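Finally, the null hypothesis note can be illustrated in the statistician's narrower sense (the book uses the term more broadly). This is a minimal sketch assuming SciPy is available; the recovery scores are invented purely for illustration, and the claim under test (that prayer improves healing) stands in for any positive claim that carries the burden of proof.

    # Minimal null-hypothesis test (invented data; illustration only)
    from scipy import stats

    prayed_for = [14, 12, 15, 13, 16, 14, 12, 15]   # hypothetical recovery scores
    control    = [13, 14, 12, 15, 13, 14, 15, 12]   # hypothetical recovery scores

    # Null hypothesis: prayer makes no difference, so the two groups share one mean.
    result = stats.ttest_ind(prayed_for, control)
    print(f"p = {result.pvalue:.2f}")   # a large p-value gives no grounds to reject the null

Failing to reject the null is not proof that prayer does nothing; it means the positive claim has not met its burden of proof, which is the asymmetry described in the notes above.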
