Lee McIntyre has written a book aimed at the general reader in which he argues that what makes science special is something he calls the "scientific attitude." By the scientific attitude, he means a willingness to change one's theory in the light of new empirical evidence. This attitude is a community ethos, not a psychological trait of individual scientists. To be a scientist is to participate in a community that has agreed to critically evaluate all ideas in the light of empirical evidence.
For McIntyre, the claim that science is characterized by this attitude is a normative rather than a descriptive thesis. He does not propose the scientific attitude as a demarcation criterion: it is a necessary but not a sufficient condition for something to count as science, so he feels no need to show that everything exhibiting the attitude is science. Rather, he aims to show only that without this attitude, something is not a science. It is what separates science from climate change denialism, creationism and intelligent design theory, and the anti-vaccination movement. He maintains that a lot can be learned by looking at how these fields fail to be science.
The first two chapters cover material that will be largely familiar to philosophers of science and scholars in related disciplines. First, he explains how philosophers have been unable to demarcate science from other endeavors by appealing to what has been regarded as the scientific method. They have not even always been clear about whether they are trying to separate science from non-science or merely from pseudoscience. He then turns to clearing up misconceptions about how science works, such as the naïve view that there is something called the scientific method that starts with observation and achieves complete certainty in its results. He reviews the problem of induction and the problem of the ambiguity of negative test results, and discusses the role of theory in the sciences.
McIntyre gets to the heart of his thesis in the third and fourth chapters, explaining what he means by the scientific attitude and how it belongs to the ethos of the scientific community. Although this attitude is supposed to be a community norm, it appears that sometimes only a lone individual lives up to it. Here he offers the familiar case of Ignaz Semmelweis's work on childbed fever, and, in a later chapter, an example from geology, in which J. Harlen Bretz's explanation of the scablands of Washington state turned out to be correct while the scientific community was wrong. In both cases, however, the community eventually came around; the scientific attitude ultimately prevailed. The community's response to Pons and Fleischmann's claim to have achieved cold fusion in the chemistry laboratory is another example of the scientific attitude at work, an attitude that Pons and Fleischmann themselves apparently lacked.
The fifth chapter is perhaps the most interesting one. Here McIntyre shows how the scientific attitude came into play in scientists' responses to various types of intentional and unintentional error and bias, including the p-hacking crisis in psychology as well as the Pons and Fleischmann case. P-hacking is the practice of trawling a data set, often with the help of computers, for statistically significant correlations without any hypothesis being tested. Joseph Simmons and his co-authors alerted the scientific community to this practice in an article in Psychological Science in 2011. In response, some journals have stopped asking for p-values, and at least one, Basic and Applied Psychology, has announced it will no longer publish them (p. 94 and note 46). What this example shows -- although McIntyre does not quite explicitly say so -- is that what unites a scientific community is not simply the sharing of standards, as many philosophers of science have assumed. In response to the p-hacking crisis, the scientific community was able to raise its standards and yet remain a community. I think this may be why McIntyre focuses on attitude rather than methods or standards as the distinguishing characteristic of science. For McIntyre, standards appear to be something explicitly stated: he writes of "agreed-upon, transparent standards" (p. 57) and an "ideal set of rules . . . agreed on in advance" (p. 81). The scientific attitude, on the other hand, is simply assumed in the scientific community.
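The danger McIntyre describes can be illustrated with a short simulation (a hypothetical sketch, not drawn from the book): if a researcher correlates enough unrelated variables, some pairs will look "significant" at the conventional 0.05 level by chance alone.

```python
import random

random.seed(1)

def pearson_r(x, y):
    """Sample correlation coefficient of two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

def permutation_p(x, y, trials=500):
    """Approximate two-sided p-value for the correlation by shuffling y."""
    observed = abs(pearson_r(x, y))
    y = y[:]
    hits = 0
    for _ in range(trials):
        random.shuffle(y)
        if abs(pearson_r(x, y)) >= observed:
            hits += 1
    return hits / trials

# 100 "studies", each correlating two *independent* noise variables.
false_positives = 0
for _ in range(100):
    x = [random.gauss(0, 1) for _ in range(30)]
    y = [random.gauss(0, 1) for _ in range(30)]
    if permutation_p(x, y) < 0.05:
        false_positives += 1

print(false_positives)  # roughly 5 of 100 pure-noise studies appear "significant"
```

Run enough analyses and report only the hits, and noise is reliably dressed up as discovery, which is exactly why pre-registered hypotheses and stricter reporting standards were the community's response.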
From McIntyre's account of the p-hacking crisis and how it was resolved, it would appear that the scientific attitude goes beyond a willingness to modify or jettison theories in the light of empirical data. It would also include a willingness to re-examine the standards, methods, practices, techniques, and working assumptions of a discipline. A precondition for the members of a discipline to reach consensus on a set of standards is that they first have some sense of participating in a common enterprise: that they feel they belong to a community of scientists who have agreed to discuss standards, methods, theories, and concepts, adopting some as conventions for the time being and revisiting and revising them as necessary. Some of these assumptions may at first be only tacit or implicit, and are made explicit only as the result of argument and debate within the scientific community.
In the next chapter, McIntyre maintains that medicine transformed itself by adopting the scientific attitude in the late nineteenth and early twentieth centuries. Again, one could argue that McIntyre's example tells a slightly different story from the one he may have originally intended, for he eventually says that what really made the difference in medicine, at least in the United States, was that physicians no longer thought of themselves as individual practitioners but as members of a profession, reading each other's publications, scrutinizing each other's practices, and shunning members who resisted new developments (p. 131). It was not so much the scientific attitude alone as the formation of a community that adopted this attitude that made medicine a science.
For McIntyre, holding the scientific attitude is a matter of degree, with intentional fraud at one extreme representing a total absence of the attitude. He concedes that "intentional" is hard to define. Yet he wants to maintain the distinction between fraud and other forms of research misconduct, so that fraudsters are not let off too easily, researchers can see more clearly when they are risking fraud, and the scientific community does not become indiscriminately suspicious of all research misconduct. Andrew Wakefield's 1998 paper claiming a link between the MMR vaccine and autism provides an example of the harm that fraud can cause.
The longest chapter, the eighth, is devoted to defending science against denialism, pseudoscience, and charlatans. Denialism is the refusal to accept theories well supported by the evidence, while pseudoscience claims the mantle of science for theories concerning empirical matters but refuses to modify these theories in light of new evidence and methodological criticism. In actual cases, such as creationism and intelligent design theory, denialism and pseudoscience can overlap. Although denialists purport to be skeptics, they use doubt selectively, remaining skeptical only of what they do not wish to be true while gullibly accepting whatever agrees with their beliefs, much like conspiracy theorists. McIntyre cites Senator Ted Cruz, who contends that the idea of human-caused climate change is part of some sort of liberal plot, as an example of someone who blends denialism and conspiracy theorizing. Scientific skepticism, on the other hand, does not rule out openness to new ideas. Pseudoscientists are not open to new ideas, especially new, potentially disconfirming evidence. This is an important chapter, as it concerns what McIntyre says motivated him to write the book in the first place. Still, I wonder about its intended audience. Would reading it help to change the mind of a climate denialist?
In the next chapter, McIntyre inquires whether the social sciences can follow the same path as medicine and transform themselves through adopting the scientific attitude. Politics and ideology tend to get in the way. Nevertheless, some social science research does exemplify the scientific attitude. He describes Sheena Iyengar and Mark Lepper's 1999 work, which showed that, contrary to an assumption often made by economists, people actually prefer to have fewer choices. But there is also weak social science research. He provides the example of Susan Fiske and Cydney Dupree's 2014 study of the lack of trust and respect that ordinary people have for scientists. However, in this case it turns out that the problem was not a lack of the proper attitude towards empirical evidence but rather the use of fuzzy concepts. Fiske and Dupree actually measured the degree of warmth people felt towards scientists and treated warmth as synonymous with trust. They reasoned, fallaciously, that if a feeling of warmth implies trust, then a lack of warmth implies a lack of trust.
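The error attributed to Fiske and Dupree is the classical fallacy of denying the antecedent, and a brute-force truth-table check makes the gap explicit (a minimal illustration of the logical point, not drawn from the book; "warmth" and "trust" are just labels for the two propositions):

```python
from itertools import product

def implies(p, q):
    # Material conditional: p -> q is false only when p is true and q is false.
    return (not p) or q

# Look for assignments where "warmth implies trust" holds,
# yet "no warmth implies no trust" fails.
counterexamples = [
    (warmth, trust)
    for warmth, trust in product([False, True], repeat=2)
    if implies(warmth, trust) and not implies(not warmth, not trust)
]

print(counterexamples)  # [(False, True)]: no warmth, yet trust
```

The single counterexample is the case McIntyre's discussion implicitly relies on: a respondent who feels no warmth towards scientists may nonetheless trust them, so measuring warmth cannot stand in for measuring trust.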
In conclusion, some of McIntyre's own examples reveal that there is more at work in science than simply a willingness to revise or even reject theories in light of new evidence. Good science requires a critical attitude towards concepts, methods, standards, and other assumptions as well. It also depends on a community of intellectuals who have agreed to support and encourage this attitude among themselves. The upshot of this book, then, is not all that different from some of what Helen Longino and Robert K. Merton have written. (McIntyre at least briefly mentions each.) For Longino, objective knowledge is produced by communities that have public standards and public venues for criticism, equality of intellectual authority, and "uptake," or the willingness to change one's views in response to criticism. The critical scientific attitude is also captured by Merton's norm of organized skepticism, one of four norms that he thought govern the scientific community, along with universalism, communality, and disinterestedness. Longino's and Merton's norms would appear to govern mathematics as well as the empirical sciences.
As I mentioned earlier, the intended audience for this book is not entirely clear. McIntyre goes into a good bit of detail explaining the difference between necessary and sufficient conditions, giving the book the feel of an introductory textbook. On the other hand, his account of the p-hacking scandal in psychology presupposes a modicum of familiarity with statistical concepts. If it is used as a textbook, the professor may wish to challenge some of McIntyre's points, such as his notion that the ambiguity of experimental testing entails that there are "a potentially infinite number of hypotheses that could fit the data" (p. 33). There are also claims regarding the history of astronomy that are inaccurate, and sweeping generalizations about the history of medicine, chemistry, and physics that any good instructor would want to challenge. Nevertheless, the book could be used with profit in an introductory philosophy of science course or a course on science and values.