Chance works in mysterious ways. Shortly after I saw the philosopher Lee McIntyre discussing How to Talk to a Science Denier on the Center for Inquiry website, I ran across his splendid little book The Scientific Attitude: Defending Science from Denial, Fraud, and Pseudoscience. Naturally, I snapped it up.
The scientific attitude is sort of a technical term to McIntyre. He is not interested in demarcating science from non-science, nor in delineating a scientific method; rather, he considers the scientific attitude necessary for science (not that it is limited to scientists). Specifically, McIntyre writes,
The scientific attitude can be summed up in a commitment to two principles:
(1) We care about empirical evidence.
(2) We are willing to change our theories in light of new evidence.
In case you think that these principles may be “too vague and unrigorous to be helpful,” McIntyre discusses “what it means not to care about evidence”: those who do not care about evidence are dogmatic and maintain their beliefs even in defiance of the evidence.
Which brings us to Sir Karl Popper. According to the Introduction, it was Popper who made McIntyre decide to become a philosopher of science. I had known of Popper beforehand, but I was somewhat taken with him after I read Conjectures and Refutations: The Growth of Scientific Knowledge* around 25 years ago. Popper was mistaken if he thought that scientists actively attempt to disprove their own theories experimentally. More likely, we try to confirm our theories. On the other hand, in the face of an apparent disconfirmation, we are willing to discard our theories or at least revise them. We may develop an ad hoc (auxiliary) hypothesis in order to preserve our theory – but it is not enough merely to state an ad hoc hypothesis, as many pseudoscientists seem to think; the ad hoc hypothesis, needless to say, must itself be disprovable (testable, falsifiable).
Popper himself notes that some “genuinely testable [falsifiable] theories,” even when found to be false, may be propped up by their “admirers,” who will introduce an ad hoc hypothesis or otherwise reinterpret their theory in such a way that it cannot be refuted. In other words, judgement is called for, and you have to know when to discard a theory and when to patch it up. If you lack judgement, you lack the scientific attitude.
How do you decide whether you have the scientific attitude? You do not just cherry-pick facts but rather
act in accordance with a well-vetted set of practices that have been sanctioned by the scientific community because they have historically led to well-justified beliefs [italics in original].
According to McIntyre’s prescription, Ptolemaic astronomy (epicycles) was science. Phlogiston theory was science. Alchemy was science. Even though these theories were wrong, they were science, in part because they were discarded and replaced when they were found to be wrong. Astrology, creationism, and flat-earth theory are not science because their “admirers” try to prop them up with untestable or unfalsifiable rationalizations.
McIntyre goes on to discuss the case of Ignaz Semmelweis, a physician who in the mid-1800s employed careful observation and experimentation and deduced that childbed fever (puerperal fever) was caused by what we would today call insufficient hygiene. Semmelweis had the scientific attitude, but, alas, no one else did, and Semmelweis died a failure. In his case, the “well-vetted set of practices that have been sanctioned by the scientific community” were unfortunately wrong, and it was decades before they were set right, with who knows how many lost lives. Stanley Pons and Martin Fleischmann, the “codiscoverers” of cold fusion, did not have the scientific attitude, and neither did a lot of other people, but eventually their “discovery” was falsified and laid to rest by those who did. McIntyre cites observations by Popper and Einstein to the effect that a critical attitude needs to precede the concept of falsifiability, in the sense that, if you do not have a critical attitude, you will not seek to falsify your own theories.
I will not go into detail, but McIntyre mostly rejects the social constructivist model and insists that evidence is required for a paradigm shift. Science, to my mind, ought to have a privileged position when it comes to questions of fact because its “fads” (and it has fads) may be chosen to some extent by preference, but their direction is generally dictated by evidence, not belief or ideology. Thomas Kuhn was correct about paradigm shifts, but paradigms in science are hard to shift and, indeed, shift only when the evidence warrants a shift.
I am not a philosopher (though a philosopher of science once called me a naïve falsificationist). I will not discuss the portions of the book that worry about demarcation theory, the social sciences, and so on, though I thought that McIntyre might have been just a wee bit supercilious in discussing how many of the social sciences were merely aspiring to be sciences. At least one of his case studies, though, was appalling.
McIntyre establishes (or perhaps defines) that fraud has to be deliberate: fabrication or falsification of data. Making mistakes, however, is not fraud. McIntyre inveighs against cherry-picking data, searching data for statistical significance, drawing inferences from too small a data set, and so on, though he does not seem to consider those to be fraud. He reminds me of a colleague who once told me, with a grin, that he had to search high and low for an oscilloscope trace that he could call “typical” but which showed all the features he wanted to emphasize.
Another chapter details incidents of “fraud and other failures.” Many are probably familiar to readers of PT, but McIntyre provides a degree of detail that may not be so familiar. Among the frauds, I am sorry to report, is the “vaccine-autism debacle,” which features Andrew Wakefield, continues to this day, and contributes to the anti-vaccination movement and its attendant outbreaks of disease, again with the penalty of countless lost lives.
On what McIntyre deems a happier note, the astronomer Andrew Lyne made an error in a publication, explained it to a scientific conference, and received a standing ovation. That is how science is supposed to work.
Under the heading “denialists, pseudoscientists, and other charlatans,” we meet those who deny well-established theories and those who insist on failed theories. Depressingly, McIntyre notes,
there has always been a human tendency to believe what we want to believe. Superstition and willful ignorance are not new to the human condition. What is new in recent years is the extent to which people can find a ready supply of "evidence" … on the Internet.
He goes on to observe that empirical evidence will not convince people who do not accept empirical evidence. “It is almost as if denialists are making faith-based assertions.” Almost?! He is too kind. Denialists are not skeptics; they are ideologues who refuse to believe conclusions that they do not like but which are based on what everyone else thinks is compelling evidence. Indeed, as he says, the
denialist pattern [is] to have impossibly high standards of proof for the things that one does not want to believe and extremely low standards of acceptance for the things that fit one's ideology.
Under confirmation bias and motivated reasoning, we meet Senator Ted Cruz, who cherry-picks a particularly hot year to “prove” that climate change is a hoax by liberal politicians out for power. It is worth reading McIntyre’s dissection of an NPR interview with Sen. Cruz.
I will conclude with a warning. McIntyre discusses the case of Harlen Bretz and the origin of the channeled scablands in the state of Washington. Like Alfred Wegener and continental drift or Galileo and heliocentric theory, Bretz accounted for the scablands by postulating a massive flood and providing significant supporting evidence for such a flood; his theory is accepted today. In the early to mid-twentieth century, however, geologists rejected it vigorously, in part because uniformitarianism was “favored by geologists [because] it was seen as a bulwark against the creationists.” That is, geologists were putting ideology ahead of science – what McIntyre calls “an example of how ideology can infect the scientific process, whether we are on the ‘right’ side of it or not.”
McIntyre tackles creationism and intelligent-design creationism, which are familiar to readers of PT, and then Robert Jahn’s study of psychokinesis, which I consider flawed science, not pseudoscience. I will not tackle either of those here. Read the book!
* Popper, Karl R., 1968, Conjectures and Refutations: The Growth of Scientific Knowledge, Harper Torchbooks.