Reading: A Skeptical Manifesto by Michael Shermer

A Skeptical Manifesto by Michael Shermer

Shermer, Michael. Why People Believe Weird Things: Pseudoscience, Superstition, and Other Confusions of Our Time. New York: W. H. Freeman, 1997.

 

Instructions:

Read the following essay. While reading, think about the answers to the questions in the boxes. Click on the tabs above for optional considerations.

Objectives:

After this module, you should be able to:

  • Explain Shermer's skepticism.
  • Analyze the arguments put forth by Shermer.
  • Debate the ideas presented by Shermer.
  • Demonstrate an understanding of skepticism.
  • Communicate ideas in the video and reading to your classmates.

What does it mean to be skeptical? Skepticism has a long historical tradition dating back to ancient Greece when Socrates observed: “All I know is that I know nothing.” But this is not a practical position to take. Modern skepticism is embodied in the scientific method that involves gathering data to formulate and test naturalistic explanations for natural phenomena. A claim becomes factual when it is confirmed to such an extent that it would be reasonable to offer temporary agreement. But all facts in science are provisional and subject to challenge, and therefore skepticism is a method leading to provisional conclusions. Some claims, such as water dowsing and ESP, have been tested (and failed the tests) often enough that we can provisionally conclude that they are false. Other claims, such as hypnosis and chaos theory, have been tested but results are inconclusive so we must continue formulating and testing hypotheses and theories until we can reach a provisional conclusion. The key to skepticism is to continuously and vigorously apply the methods of science to navigate the treacherous straits between “know nothing” skepticism and “anything goes” credulity. This manifesto — a statement of purpose of sorts — explores these themes further.

The History, Meaning & Limits of Skepticism

The modern skeptical movement is a fairly recent phenomenon dating back to Martin Gardner’s 1952 classic, Fads and Fallacies in the Name of Science. Gardner’s copious essays and books over the past four decades debunking all manner of bizarre claims, coupled with James “the Amazing” Randi’s countless psychic challenges and media appearances throughout the 1970s and 1980s (including 36 appearances on The Tonight Show), pushed the skeptical movement to the forefront of public consciousness. The philosopher Paul Kurtz helped create dozens of skeptics groups throughout the United States and abroad, and his Committee for the Scientific Investigation of Claims of the Paranormal (CSICOP) inspired me to found the Skeptics Society and Skeptic magazine, now with both national and international membership and circulation. There is today a burgeoning group of people calling themselves skeptics — scientists, engineers, physicians, lawyers, professors and teachers, and the intellectually curious from all walks of life — who conduct investigations, hold monthly meetings and annual conferences, and provide the media and general public with natural explanations for apparently supernatural phenomena.

But skepticism as a way of thinking has a long historical tradition that can be traced back at least 2,500 years. The foremost historian of skepticism, Richard Popkin, tells us (1979, p. xiii): “Academic skepticism, so-called because it was formulated in the Platonic Academy in the third century B.C., developed from the Socratic observation, ‘All I know is that I know nothing.’” Two popular meanings of the word today are that a skeptic believes nothing, or that a skeptic is closed-minded to certain beliefs. There is good reason for the perception of the first meaning. The Oxford English Dictionary (OED) gives this common usage for the word skeptic:

One who, like Pyrrho and his followers in Greek antiquity, doubts the possibility of real knowledge of any kind; one who holds that there are no adequate grounds for certainty as to the truth of any proposition whatever (Vol. 2, p. 2663).

Since this position is sterile and unproductive and held by virtually no one (except a few confused solipsists who doubt even their own existence), it is no wonder that so many find skepticism disturbing. A more productive meaning of the word skeptic is the second usage given by the OED:

One who doubts the validity of what claims to be knowledge in some particular department of inquiry; one who maintains a doubting attitude with reference to some particular question or statement.

It is easy, even fun, to challenge others’ beliefs when we are smug in the certainty of our own. But when ours are challenged, it takes great patience and ego strength to listen with an unjaundiced ear. There is, however, a deeper flaw in pure skepticism. Taken to an extreme, the position by itself cannot stand. The OED gives us this 1674 literary example (Tucker Lt. Nat. II):

There is an air of positiveness in all skepticism, an unreserved confidence in the strength of those arguments that are alleged to overthrow all the knowledge of mankind.

Skepticism is itself a positive assertion about knowledge, and thus turned on itself cannot be held. If you are skeptical about everything, you would have to be skeptical of your own skepticism. Like the decaying sub-atomic particle, pure skepticism uncoils and spins off the viewing screen of our intellectual cloud chamber.

Nor does skepticism produce progress. It is not enough simply to reject the irrational; skepticism must be followed by something rational, something that does produce progress. As the Austrian economist Ludwig von Mises warned those anti-communists who offered no rational alternative to the system of which they were so skeptical (1956, p. 112):

An anti-something movement displays a purely negative attitude. It has no chance whatever to succeed. Its passionate diatribes virtually advertise the program they attack. People must fight for something that they want to achieve, not simply reject an evil, however bad it may be.

Carl Sagan sounded a similar warning to skeptics:

You can get into a habit of thought in which you enjoy making fun of all those other people who don’t see things as clearly as you do. We have to guard carefully against it (in Basil, 1988, p. 366).

 

Comprehension Questions:

  1. What does it mean to say that a claim is factual?
  2. For Shermer, what is the key to skepticism?
  3. What is the logical problem with pure skepticism?

The Rational Skeptic

The second popular notion that skeptics are closed-minded to certain beliefs comes from a misunderstanding of skepticism and science. Skeptics and scientists are not necessarily “closed-minded” (though they may be, since they are human). They may once have been open-minded to a belief, but when the evidence came up short they rejected it. There are already enough legitimate mysteries in the universe, supported by evidence, to provide scientists with fodder for their research. To take the time to consider “unseen” or “unknown” mysteries is not always practical. When the non-skeptic says, “you’re just closed-minded to the unknown forces of the universe,” the skeptic responds: “We’re still trying to understand the known forces of the universe.”

It is for these reasons that it might be useful to modify the word skeptic with “rational.” Again, it is constructive to examine the usage and history of this commonly used word. Rational is given by the OED as: “Having the faculty of reasoning; endowed with reason” (p. 2420). And reason as “A statement of some fact employed as an argument to justify or condemn some act, prove or disprove some assertion, idea, or belief” (p. 2431). It may seem rather pedantic to dig through the dictionary and pull out arcane word usages and histories. But it is important to know how a word was intended to be used and what it has come to mean. They are often not the same, and more often than not, they have multiple usages such that when two people communicate they are frequently talking at cross purposes. One person’s skepticism may be another’s credulity. And who does not think they are rational when it comes to their own beliefs and ideologies?

It is also important to remember that dictionaries do not give definitions; they give usages. For a listener to understand a speaker, and for a reader to follow a writer, important words must be defined with semantic precision for communication to be successful. What I mean by skeptic is the second usage above: “One who doubts the validity of what claims to be knowledge in some particular department of inquiry.” And by rational: “A statement of some fact employed as an argument to justify or condemn some act, prove or disprove some assertion, idea, or belief.” But these usages leave out one important component: the goal of reason and rationality. The ultimate end to thinking is to understand cause-and-effect relationships in the world around us. The goal is to know the universe, the world, and ourselves. Since rationality is the most reliable means of thinking, a rational skeptic may be defined as:

One who questions the validity of particular claims of knowledge by employing or calling for statements of fact to prove or disprove claims, as a tool for understanding causality.

In other words, skeptics are from Missouri — the “show me” state. When we hear a fantastic claim we say, “that’s nice, prove it.”

Let me offer an example of how a rational skeptic might analyze a claim. For many years I had heard stories about the so-called “Hundredth-Monkey phenomenon” and was fascinated with the possibility that there might be some sort of collective consciousness into which we can tap to decrease crime, eliminate wars, and generally unite as a single species. In the last presidential election, in fact, one candidate — Dr. John Hagelin from the Natural Law Party — claimed that if elected he had a plan to solve the problems of our inner cities: meditation.

Hagelin and others (especially proponents of Transcendental Meditation) believe that thought can somehow be transferred between people, especially in a meditative state; if enough people meditate at the same time, some sort of critical mass will be reached, thereby inducing significant planetary change. The Hundredth-Monkey phenomenon is commonly cited as empirical proof of this astonishing claim. In the 1950s, so the story goes, Japanese scientists gave monkeys on Koshima Island potatoes. One day one of the monkeys learned to wash the potatoes and then taught the skill to others. When about 100 monkeys had learned the skill — the so-called critical mass — suddenly all the monkeys automatically knew it, even those on other islands hundreds of miles away. The belief is widespread in New Age circles: Lyall Watson’s Lifetide (1979) and Ken Keyes’s The Hundredth Monkey (1982), for example, have been through multiple printings and sold millions of copies; and Elda Hartley made a film called The Hundredth Monkey.

As an exercise in skepticism we should start by asking if these events really happened as reported. They did not. In 1952, primatologists began providing Japanese macaques with sweet potatoes to keep them from raiding local farms. One of them did learn to wash dirt off the potatoes in a stream or the ocean, and other monkeys learned to model the behavior (modeling is a normal part of primate behavior — “monkey see, monkey do” predates the New Age). Now let’s examine Watson’s claim more carefully. He admits that “one has to gather the rest of the story from personal anecdotes and bits of folklore among primate researchers, because most of them are still not quite sure what happened. So I am forced to improvise the details.” Watson then speculates that “an unspecified number of monkeys on Koshima were washing sweet potatoes in the sea,” hardly the level of precision required to justify so far-reaching a conclusion. He then makes this astonishing statement:

Let us say, for argument’s sake, that the number was 99 and that at 11:00 a.m. on a Tuesday, one further convert was added to the fold in the usual way. But the addition of the hundredth monkey apparently carried the number across some sort of threshold, pushing it through a kind of critical mass.

At this point, says Watson, the habit “seems to have jumped natural barriers and to have appeared spontaneously on other islands.”

One need go no further. Scientists do not “improvise” details or make wild guesses from “anecdotes” and “bits of folklore.” But there is more. In fact, some real scientists did record exactly what happened. The troop began with 20 monkeys in 1952 and reached 59 by 1962, and every monkey on the island was carefully observed. By March of 1958, exactly 17 of 30 monkeys had modeled the washing behavior; by 1962, exactly 36 of 49 had done so. The “sudden” acquisition of the behavior actually took four years, and the “100 monkeys” were actually only 17 in 1958 and 36 in 1962. And while there are some reports of similar behavior on other islands, the observations were made between 1953 and 1967. It was not sudden, nor was it connected in any way to Koshima. The monkeys on other islands could have discovered this simple skill themselves; or researchers or inhabitants of the islands might have taught them; or monkeys from Koshima might have been taken there. In any case, there is nowhere near the evidence necessary to support this extraordinary claim. There is not even any real phenomenon to explain.

 

Comprehension Questions:

  1. What is the goal of the rational skeptic?
  2. How is the Hundredth-Monkey phenomenon based on a mistaken notion of causality?

Science & Skepticism

Skepticism, then, is a vital part of science. Reviewing the usages and history of the word science would be inappropriately long here (see Chapter 2). For purposes of clarity, science will be taken to mean:

a set of mental and behavioral methods designed to describe and interpret observed or inferred phenomena, past or present, aimed at building a testable body of knowledge open to rejection or confirmation.

In other words, science is a specific way of thinking and acting — a tool for understanding information that is perceived directly or indirectly (“observed or inferred”). “Past or present” refers to both the historical and the experimental sciences. Mental methods include hunches, guesses, ideas, hypotheses, theories, and paradigms; behavioral methods include background research, data collection, data organization, colleague collaboration and communication, experiments, correlation of findings, statistical analyses, manuscript preparation, conference presentations, and publications. What then is the scientific method? One of the more insightful and amusing observations was made by the Nobel laureate and philosopher of science, Sir Peter Medawar (1969, p. 11):

Ask a scientist what he conceives the scientific method to be and he will adopt an expression that is at once solemn and shifty-eyed: solemn, because he feels he ought to declare an opinion; shifty-eyed, because he is wondering how to conceal the fact that he has no opinion to declare.

A sizable body of literature exists on the scientific method, yet there is little consensus among the authors. This does not mean that scientists do not know what they are doing. Doing and explaining may be two different things. For the purpose of outlining a methodology for the rational skeptic to apply to questionable claims, the following four-step process may represent, on the simplest of levels, something that might be called the “scientific method”:

Observation: Gathering data through the senses or sensory enhancing technologies.

Induction: Drawing general conclusions from the data. Forming hypotheses.

Deduction: Making specific predictions from the general conclusions.

Verification: Checking the predictions against further observations.

Science, of course, is not this rigid; and no scientist consciously goes through such “steps.” The process is a constantly interactive one between making observations, drawing conclusions, making predictions, and checking them against further evidence. This process constitutes the core of what philosophers of science call the hypothetico-deductive method, which involves:

  1. putting forward a hypothesis,
  2. conjoining it with a statement of ‘initial conditions’,
  3. deducing from the two a prediction, and
  4. finding whether or not the prediction is fulfilled

(Bynum, Browne, Porter, 1981, p. 196)

 It is not possible to say which came first, the observation or the hypothesis, since we do both from childhood, through school, to college, into graduate training, and on the job as scientists. But observations are what flesh out the hypothetico-deductive process and serve as the final arbiter for the validity of the predictions, as Sir Arthur Stanley Eddington noted: “For the truth of the conclusions of science, observation is the supreme court of appeal” (1958, p. 9). Through the scientific method we may form the following generalizations:

Hypothesis: A testable statement to account for a set of observations.

Theory: A well-supported testable statement to account for a set of observations.

Fact: Data or conclusions confirmed to such an extent it would be reasonable to offer temporary agreement.

A hypothesis and theory may be contrasted with a construct: a non-testable statement to account for a set of observations. The observation of living organisms on Earth may be accounted for by God or by evolution. The first statement is a construct, the second a theory. Most biologists would even call evolution a fact by the above definition.

Through the scientific method we aim for objectivity: the basing of conclusions on external validation. And we avoid mysticism: the basing of conclusions on personal insights that lack external validation. There is nothing wrong with personal insight. Many great scientists have attributed important ideas to insight, intuition, and other equally difficult-to-define concepts. Alfred Wallace said that the idea of natural selection “suddenly flashed upon” him during an attack of malaria. Timothy Ferris called Einstein “the great intuitive artist of science.” But insightful and intuitive ideas do not gain acceptance until they are externally validated, as Richard Hardison explained (1988, pp. 259-260):

Mystical “truths,” by their nature, must be solely personal, and they can have no possible external validation. Each has equal claim to truth. Tea leaf reading and astrology and Buddhism; each is equally sound or unsound if we judge by the absence of related evidence. This is not intended to disparage any one of the faiths; merely to note the impossibility of verifying their correctness. The mystic is in a paradoxical position. When he seeks external support for his views he must turn to external arguments, and he denies mysticism in the process. External validation is, by definition, impossible for the mystic.

Science leads us toward rationalism: the basing of conclusions on the scientific method. For example, how do we know the Earth is round?

  1. The shadow on the moon is round.
  2. The mast of a ship is the last thing seen as it sails off the horizon.
  3. The horizon is curved.
  4. Photographs from space.

And science helps us avoid dogmatism: the basing of conclusions on authority rather than science. For example, how do we know the Earth is round?

  1. Our parents told us.
  2. Our teachers told us.
  3. Our minister told us.
  4. Our textbook told us.

Dogmatic conclusions are not necessarily invalid, but they do pose another question: how did the authorities come by their conclusions? Did they use science or some other means?

 

Comprehension Questions:

  1. What is the hypothetico-deductive method?
  2. What is mysticism?

It is important that we recognize the fallibility of science and the scientific method. But within this fallibility lies its greatest strength: self-correction. Whether mistakes are made honestly or dishonestly, whether a fraud is unknowingly or knowingly perpetrated, in time it will be flushed out of the system through the lack of external verification.

Because of the importance of this self-correcting feature, there is in the profession what Richard Feynman calls “a principle of scientific thought that corresponds to a kind of utter honesty — a kind of leaning over backwards.” Feynman says:

If you’re doing an experiment, you should report everything that you think might make it invalid — not only what you think is right about it: other causes that could possibly explain your results (1988, p. 247).

Despite these built-in mechanisms, science is still subject to a number of problems and fallacies that, as even the most careful scientist and rational skeptic are aware, can be troublesome. We can, however, find inspiration in those who have overcome them to make monumental contributions to our understanding of the world.

The Tool of the Mind

Science is the best method humankind has devised for understanding causality. Therefore the scientific method is our most effective tool for understanding the causes of the effects we are confronted with in our personal lives as well as in nature. There are few human traits that most observers would call truly universal. Most would agree, however, that survival of the species as a whole, and the achievement of greater happiness for individuals in particular, are universals that most humans seek. We have seen the interrelationship between science, rationality, and rational skepticism. Thus, we may go so far as to say that the survival of the human species and the attainment of greater happiness for individuals depend on the ability to think scientifically, rationally, and skeptically.

It is assumed that human beings are born with the ability to perceive cause-and-effect relationships. When we are born we have no cultural experience whatsoever. But we do not come into the world completely ignorant. We know lots of things — how to see, hear, digest food, track a moving object in the visual field, blink at approaching objects, become anxious when placed over a ledge, develop a taste aversion for noxious foods, and so on. We also inherit the traits our ancestors evolved in a world filled with predators and natural disasters, poisons and dangers, and risks from all sides. We are descended from those ancestors who were most successful at understanding causality.

Our brains are natural machines for piecing together events that may be related and for solving problems that require our attention. One can envision an ancient hominid from Africa chipping and grinding and shaping a rock into a sharp tool for carving up a large mammalian carcass. Or perhaps we can imagine the first individual who discovered that striking flint would create a spark with which to light a fire. The wheel, the lever, the bow and arrow, the plow — inventions intended to allow us to shape our environment rather than be shaped by it — started civilization down a path that led to our modern scientific and technological world.

One of the characteristics that sets man apart from all the other animals (and animal he indubitably is) is a need for knowledge for its own sake. Many animals are curious, but in them curiosity is a facet of adaptation. Man has a hunger to know. And to many a man, being endowed with the capacity to know, he has a duty to know. All knowledge, however small, however irrelevant to progress and well-being, is a part of the whole. It is of this the scientist partakes. To know the fly is to share a bit in the sublimity of Knowledge. That is the challenge and the joy of science.