Given its subtitle, I expected something different from Weirdness!. I teach logic and critical thinking primarily by teaching students how to think about weird things. Schick and Vaughn's How to Think About Weird Things is my primary text.1 Because Edis also teaches a class on weird things, I expected case studies in weirdness, followed by examinations, ultimate debunkings, and then reflections on what the process teaches us about the nature of science. “Investigating ghosts teaches us the value of simplicity. Debunking conspiracy theories teaches us about the irrationality of ad hoc reasoning.” But that's not what Weirdness! is.
Instead, it is a loosely organized set of general reflections on what Edis has concluded about science (and many other matters) through teaching his class. Interspersed throughout are seemingly non-sequitur fictional stories about things like a scientist who flipped coins that always came up heads or a woman driven mad after gaining omniscience. As such, it's a bit difficult to follow. The purpose or thesis of a chapter is often not clear until the last paragraph (if at all), and the reader often struggles to divine the relevance of the fictional stories (although some readers might find this fun).
That said, he gets a lot right. For example, the fact that ghosts don't exist, that conspiracy theories are bunk, that UFOs aren't real, that near-death experiences are a result of neurological activity, that alternative medicine doesn't work and is harmful, and that mystical experiences don't justify religious beliefs, are all communicated in one way or another throughout the book. Edis is a good scientist who has never been fooled into thinking some weird thing is real when it's not.
What's more, although he does not mention it by name, Edis offers a solid critique of methodological naturalism, the idea that (because of its approach and philosophical underpinnings) science can investigate only natural phenomena and thus can neither confirm nor rule out the existence of the supernatural. In reality, if ghosts or psychic powers were real, their existence could be revealed by scientific testing; the fact that such testing has revealed no such thing is adequate justification for concluding that they don't exist. Methodological naturalism may be an apologist's way to make the religious more comfortable with science—“Science by its nature doesn't deal with the supernatural, so it can't disprove it”—but it's not an accurate description of what science is (or can do). As Edis says about the idea that science and religion occupy separate domains and so cannot possibly conflict—also known as Stephen Jay Gould's non-overlapping magisteria thesis2—“Separating the spheres of science and religion is useful for keeping the peace, but it is intellectually dubious.” (p. 110)
Edis does not, however, think that there is one, defining, scientific method. He rightly points out that falsification is not everything in science, and that the classic “hypothesize, predict, test, repeat” articulation of the scientific method is grossly inadequate. (I, myself, have elaborated on this.)3 Indeed, science's flexibility is what makes it able to investigate supernatural claims. But this is also where Edis gets a few things wrong.
To prove his point about falsifiability, he compares (and essentially equates) how creationists excuse away evidence against creationism (e.g., the devil planted the fossils to test our faith) to the way that scientists have defended their current theories about gravity: “Those distant galaxies must not move as our current gravitational theories predict because they contain matter we cannot see (i.e., dark matter).” But there is a huge difference between making a fundamentally untestable assumption (devil fossils) to save a theory that had no good evidence in the first place (creationism) and making a theoretically testable assumption (dark matter) to preserve a theory that has vast explanatory power and is supported by centuries of evidence (gravitational theory). The former is called making up “ad hoc excuses” and demonstrates desperation and irrationality. The latter does not.
Moreover, while Edis is certainly right that scientific reasoning looks very different, and uses different tools, in different fields and contexts and for different purposes (and he rightly identifies what my colleagues in the history department do as a kind of science), he is wrong that science does not have an underlying method. As Ernan McMullin argued in The Inference that Makes Science, scientific reasoning is, at its base, inference to the best explanation.4 In fact, Edis unwittingly acknowledges this on page 119: “When we ask about the nature of science or probe what might be wrong with some weirdness, we do something much like a science of science—an explanation of how good explanations work.” If, in doing a science of science (trying to figure out what science is), we are trying to figure out what makes explanations good, it must be that science is an effort to find the best explanation.
Still, Edis is on the right track. Inference to the best explanation, as a method, incorporates all different kinds of reasoning—double-blind tests, Bayesianism, careful observation, avoidance of formal and informal fallacies, deduction, etc.—and won't look exactly the same every time. Although it does generally appeal to the same criteria—what Schick and Vaughn identify as testability (i.e., falsifiability), fruitfulness (i.e., successful prediction), scope (i.e., explanatory power), simplicity (i.e., parsimony), and conservatism (i.e., how much it aligns with already established theories)5—in different circumstances, certain of those criteria will be more important, and certain kinds of evidence and reasoning will be more relevant. Nevertheless, inference to the best explanation is a single method that can be articulated and taught.
Consider an analogy that would fit well in Edis' book. Science is like building a house. There are all different ways of going about it, and all different kinds of tools that one might need while doing it—and in certain circumstances (doing the plumbing, say) certain tools will be used more than others. But that doesn't mean that there isn't a method for building houses (one that requires a certain “housebuilder's tool kit”).
Edis also makes many astute observations, like why those in the practical sciences (engineers, nurses, and doctors), who use science (and technology) to accomplish specific tasks, are more apt to believe in pseudoscience and other false weird things. They more often think of science (wrongly) as a collection of facts rather than (rightly) as a method for discovering the truth about the world. Consequently, they often only know how to use the kinds of scientific reasoning applicable to their own field and do not know how to apply scientific reasoning more broadly. (This accounts for the frustrations of a colleague of mine, who regularly complains that too many of his graduating chemistry majors still believe in things like Bigfoot.)
Weirdness! also gave me a new appreciation for why certain people refuse to think about weirdness: the intellectual and social cost is too high. To understand arguments for why the belief in Jesus' resurrection is both historically and scientifically illegitimate, one would have to invest hours of research into the historicity and authorship of the Bible and the evidence for and against the resurrection, and develop an understanding of how reasoning about past events works. That's a lot of work. And what's the payoff? For many, it seems to be ostracism from friends, family, and community. No wonder people close their ears. (But if you are interested, see my article “Inference to the Best Explanation and Rejecting the Resurrection.”)6
There are shortcuts through that kind of cognitive load: you can just listen to the experts. But as Edis again acutely observes, not only can experts sometimes be wrong (although, for the layperson, it is still rational to listen to them even when they might be), but relying on experts also comes with its own cognitive costs. It takes time and effort to figure out who the relevant experts are and whether someone is speaking outside their field of expertise. Is there a consensus, or is there genuine disagreement among the experts? Although this takes much less work than doing your own research, it's not easy.
It's in this context that Edis makes the useful distinction between instrumental rationality and reflective rationality. Instrumental rationality considers the social and cognitive costs of true belief. Will accepting the truth make life less comfortable? Will it cost you friends or ostracize you from family? How much time and cognitive effort will you have to spend to discover it? If the cost is too high, instrumental rationality would suggest the quest for truth is not worth it. Reflective rationality, however, ignores all that. It demands dedication to truth no matter the cost. Both can be rational, Edis argues, but while he seems to favor reflective rationality in most situations himself, he makes clear that it is not for everyone (p. 171), and even says that “a devotion to truth above every other interest would be a form of fanaticism.” (p. 222).
While I am certainly sensitive to the fact that the time and cognitive effort needed to seek the truth are not available to everyone (and seeking the truth on all matters is impossible), I hesitate to embrace the idea that the social costs of the truth can be a good reason to ignore it. Not only do I think that the lessons of Plato's cave allegory hit home—being duped into believing false things about the nature of the world is an objectively pitiful state to be in—but the allegory that Edis tells before ultimately defending instrumental rationality is ethically problematic. It describes members of a fictional nation-state embracing its revisionist history (a history fed to them by their government and society, which preserves the notion of their nation's superiority) because the social costs of embracing the truth are too high. This, it seems, echoes the concerns of those who complain about U.S. history courses that don't whitewash the truth about the United States' numerous moral crimes (e.g., how it treated natives, Africans, and immigrants).7 It seems only people who want to repeat those crimes would object to studying them; yet the need to be instrumentally rational could be used to defend their objections. Or to put my point more simply: the social costs of rejecting Nazism in 1930s Germany were high, so it was instrumentally rational to embrace it. That didn't make it morally acceptable to do so.
Edis might respond by suggesting that there are no moral facts. “Science can't deliver hard moral facts, because there are no such facts” (p. 175). He even decries those who have raised moral concerns about some of his works (on how Muslims distort history) as “moralists.” “It's the blasted moralists who ruin everything with their moral panics and witch hunts and thought police” (p. 178). While I think the complaints about his research were overblown, his complaints here about “moralists” rhyme a bit too much with complaints about “virtue signaling,” a term used by those on the far right (about whom Edis complains) to dismiss genuine moral concerns.8 What's more, Edis seems unaware that, despite professing there to be no moral facts, he lodges many moral complaints throughout the book (not least against the “moralists” themselves and against those on the far right). That's the problem with moral nihilism: you can't even consistently celebrate that the Allies won WWII.
While reading the book I also became worried that Edis might be encouraging his students to fall into a trap identified by Michael Caulfield in his book Web Literacy for Student Fact-Checkers.9 Simply encouraging students to “think critically” about weird things can backfire; what conspiracy theorists and pseudoscientists want most is your attention, to simply be heard—because the more attention they get, the more people will believe them. Without the proper mindset, experience, and cognitive tools, one can easily be fooled. This is why so many who say they “did their own research” end up believing in weird (and dangerous) things (such as vaccine denial). When Edis said that he lets his students dictate the content of his class each semester by selecting the subjects they will explore, it seemed to me that he may be taking this erroneous approach. My worries were somewhat vindicated when Edis told a story about a student in his class who used what he learned from Edis to defend intelligent design—although, thankfully, that student did return years later to say that he had rejected his former beliefs.
Edis also seems to make a mistake in his argument that evolutionary explanations for why we are predisposed to believe weird things are insufficient to debunk them. He thinks that scientific reasoning is also something that evolved, and thus such an argument would give us reason to doubt science too. But this is a mistake similar to one made by those who advocate for theism with what is known as the argument from reason. “The only way reason is reliable as a truth-preserving process,” the argument suggests, “is if God gave it to us. If it evolved, it's just for survival—and thus is not truth preserving.” What such philosophers (and Edis) miss is that scientific reasoning was not “selected for” by evolution. What was selected for was our larger brains, likely because they enabled us to make and throw spears. The fact that we had larger brains did eventually allow us to develop and engage in scientific reasoning, but that was something developed, relatively recently, by many people over time. It did not confer a survival advantage. Unlike instinctive reasoning processes—such as agent detection—that likely were selected for in our ancient past (and are not reliable), scientific reasoning is an evolutionary spandrel, a byproduct of the evolution of some other characteristic. (For more on the argument from reason, see my chapters in Greg Bassham's C.S. Lewis's Christian Apologetics.)10
I very much appreciated Edis' observations about philosophy of religion, which has essentially devolved into Christian apologetics and often engages in the same kind of fallacious reasoning used to defend all manner of weird things.11 I was disappointed, however, that he did not go into more detail about why he “discourage[s] students from diagnosing fallacies associated with paranormal claims [because] most so-called fallacies actually work well in everyday situations, and any notion of rationality that attempts to reduce reasoning to a list of rules will self-immolate in short order” (p. 151–152). In my experience, learning to identify and avoid fallacies is essential to critical thinking, and using them in everyday situations usually has bad results—although it is important to teach students to avoid the fallacy fallacy (the fact that an argument contains a fallacy does not necessarily entail that its conclusion is false) and to realize that naming the fallacy in another person's argument will likely not convince them they are wrong.
I was also disappointed that there was no discussion of (perhaps) the weirdest notion of all: the hypothesis that we live in a computer simulation. After all, Oxford philosopher Nick Bostrom has argued that the probability is as high as 20%;12 and Marcus Arvan, with his peer-to-peer simulation hypothesis, has essentially argued that, although it is not testable, the theory is not irrational because it could provide a unified theory that answers basically every major unanswered question in both physics and philosophy13 (and physics and philosophy are Edis' two main interests!). To me, this is a prime example of why falsifiability is not everything in science.
Still, generally, Weirdness! was a good read; and (although I will not use it in class) I would recommend it to those who are interested in weird things. But a word of warning about the last chapter: Not only is its story about how Turkey went from secular to fascistically religious an eerie foreshadowing of what is happening in the U.S., but the chapter also delves into the current deplorable state of science education, lack of respect for science, and how that relates to our (lack of) hope for widespread governmental action on climate change. It left me with the same trepidation I had after watching the movie Don't Look Up. “Our species is doomed.” So, while not a light-hearted read, Weirdness! is a book that stimulates deep thinking about important issues.
David Kyle Johnson is a professor of Philosophy at King's College (PA), a professor for The Great Courses, Executive Officer for the Global Center for Religious Research, and specializes in philosophy of religion, logic, metaphysics, and critical thinking.